Apr 16 13:59:24.868549 ip-10-0-128-60 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:59:25.331847 ip-10-0-128-60 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:25.331847 ip-10-0-128-60 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:59:25.331847 ip-10-0-128-60 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:25.331847 ip-10-0-128-60 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:59:25.331847 ip-10-0-128-60 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:25.333084 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.332648 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:59:25.335760 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335742 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:25.335760 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335759 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335763 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335767 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335770 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335773 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335776 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335779 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335782 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335787 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335791 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335795 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335798 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335808 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335811 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335814 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335817 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335820 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335822 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335825 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:25.335828 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335828 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335832 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335836 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335840 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335843 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335846 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335848 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335851 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335855 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335857 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335860 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335863 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335866 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335868 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335871 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335874 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335876 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335879 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335882 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:25.336297 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335884 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335887 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335889 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335892 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335894 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335897 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335900 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335903 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335905 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335908 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335911 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335913 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335915 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335918 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335921 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335924 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335927 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335929 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335932 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335934 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:25.336787 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335937 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335940 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335942 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335945 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335947 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335950 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335953 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335955 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335958 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335960 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335963 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335966 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335969 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335971 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335975 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335977 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335980 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335983 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335986 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335988 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:25.337281 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335993 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335996 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.335999 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336001 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336003 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336006 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336010 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336424 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336430 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336433 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336436 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336439 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336442 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336445 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336449 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336452 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336455 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336458 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336461 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:25.337765 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336464 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336466 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336469 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336472 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336475 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336477 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336480 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336482 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336485 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336488 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336490 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336493 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336496 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336499 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336502 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336504 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336507 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336509 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336512 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336516 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:25.338339 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336519 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336521 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336524 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336527 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336529 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336532 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336534 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336537 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336540 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336542 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336545 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336548 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336550 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336553 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336555 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336558 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336560 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336565 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336570 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:25.339117 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336573 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336576 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336579 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336581 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336584 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336587 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336589 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336592 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336595 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336598 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336600 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336603 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336606 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336608 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336611 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336613 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336616 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336618 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336621 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336624 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:25.339604 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336626 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336629 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336633 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336636 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336638 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336641 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336643 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336646 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336649 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336651 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336654 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336657 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336659 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336662 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.336665 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337451 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337462 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337469 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337474 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337478 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337483 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:59:25.340100 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337488 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337492 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337496 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337499 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337503 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337507 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337510 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337513 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337516 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337519 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337522 2569 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337525 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337528 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337533 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337536 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337539 2569 flags.go:64] FLAG: --config-dir=""
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337542 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337545 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337550 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337553 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337556 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337560 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337563 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337567 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:59:25.340633 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337570 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337573 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337576 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337580 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337583 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337586 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337589 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337592 2569 flags.go:64] FLAG: --enable-server="true"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337596 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337600 2569 flags.go:64] FLAG: --event-burst="100"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337604 2569 flags.go:64] FLAG: --event-qps="50"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337607 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337610 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337613 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337617 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337620 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337623 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337627 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337630 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337633 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337636 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337639 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337642 2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337645 2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 13:59:25.341210 ip-10-0-128-60 
kubenswrapper[2569]: I0416 13:59:25.337647 2569 flags.go:64] FLAG: --feature-gates="" Apr 16 13:59:25.341210 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337651 2569 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337654 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337657 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337661 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337664 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337667 2569 flags.go:64] FLAG: --help="false" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337670 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-128-60.ec2.internal" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337674 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337677 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337680 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337683 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337687 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337690 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 
13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337693 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337696 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337699 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337702 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337705 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337709 2569 flags.go:64] FLAG: --kube-reserved="" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337712 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337715 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337718 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337721 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337724 2569 flags.go:64] FLAG: --lock-file="" Apr 16 13:59:25.341841 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337726 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337729 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337733 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337738 2569 flags.go:64] FLAG: 
--log-json-split-stream="false" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337741 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337744 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337747 2569 flags.go:64] FLAG: --logging-format="text" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337750 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337753 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337756 2569 flags.go:64] FLAG: --manifest-url="" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337759 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337764 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337767 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337772 2569 flags.go:64] FLAG: --max-pods="110" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337776 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337779 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337782 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337785 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 13:59:25.342440 ip-10-0-128-60 
kubenswrapper[2569]: I0416 13:59:25.337788 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337791 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337794 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337802 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337805 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337809 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 13:59:25.342440 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337812 2569 flags.go:64] FLAG: --pod-cidr="" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337815 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337822 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337825 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337828 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337831 2569 flags.go:64] FLAG: --port="10250" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337834 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337837 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f5ef628221da6bc0" Apr 16 
13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337840 2569 flags.go:64] FLAG: --qos-reserved="" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337843 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337846 2569 flags.go:64] FLAG: --register-node="true" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337849 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337852 2569 flags.go:64] FLAG: --register-with-taints="" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337856 2569 flags.go:64] FLAG: --registry-burst="10" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337859 2569 flags.go:64] FLAG: --registry-qps="5" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337862 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337864 2569 flags.go:64] FLAG: --reserved-memory="" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337871 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337874 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337878 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337882 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337885 2569 flags.go:64] FLAG: --runonce="false" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337888 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 13:59:25.343033 
ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337891 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337894 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 16 13:59:25.343033 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337897 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337900 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337903 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337906 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337909 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337913 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337915 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337918 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337921 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337925 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337929 2569 flags.go:64] FLAG: --system-cgroups="" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337932 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 
13:59:25.337937 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337940 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337943 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337948 2569 flags.go:64] FLAG: --tls-min-version="" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337951 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337954 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337957 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337960 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337963 2569 flags.go:64] FLAG: --v="2" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337967 2569 flags.go:64] FLAG: --version="false" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337971 2569 flags.go:64] FLAG: --vmodule="" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337975 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.337979 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 13:59:25.343638 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338367 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338387 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:25.344236 
ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338393 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338398 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338403 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338410 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338415 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338420 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338425 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338436 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338441 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338445 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338451 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338456 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338461 2569 feature_gate.go:328] unrecognized feature gate: 
AzureDedicatedHosts Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338466 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338470 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338475 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338480 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338484 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:25.344236 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338489 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338494 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338503 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338507 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338512 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338516 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338521 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338525 2569 feature_gate.go:328] 
unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338530 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338534 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338539 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338544 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338548 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338559 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338568 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338573 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338577 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338582 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338586 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:25.344798 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338593 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338600 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338606 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338611 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338616 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338620 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338625 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338634 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338638 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338643 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338647 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338652 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338656 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 
13:59:25.338660 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338665 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338670 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338674 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338679 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338683 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338688 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:25.345308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338696 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338701 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338705 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338709 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338714 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338718 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 
13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338727 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338732 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338737 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338742 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338748 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338754 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338765 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338770 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338774 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338779 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338783 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338787 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 
13:59:25.338791 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:25.345807 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338796 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:25.346308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338800 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:59:25.346308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338805 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:25.346308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338809 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:25.346308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338814 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:25.346308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338818 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:25.346308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338827 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:25.346308 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.338831 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:25.346308 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.339572 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:59:25.349774 ip-10-0-128-60 kubenswrapper[2569]: I0416 
13:59:25.349752 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 13:59:25.349774 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.349775 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349824 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349830 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349835 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349839 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349842 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349845 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349848 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349852 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349855 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349857 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349860 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 
13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349863 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349866 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349869 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349872 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349874 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349877 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349880 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349883 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:25.349877 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349886 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349890 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349893 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349896 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349898 2569 feature_gate.go:328] 
unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349901 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349903 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349906 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349909 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349911 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349914 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349917 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349919 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349924 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349928 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349931 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349934 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349936 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349938 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349941 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:25.350501 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349944 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349947 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349949 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349952 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349955 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349958 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349961 2569 
feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349964 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349966 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349969 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349971 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349974 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349977 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349979 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349983 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349986 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349989 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349992 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349994 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:25.350990 
ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349996 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:59:25.350990 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.349999 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350001 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350004 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350007 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350010 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350012 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350015 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350017 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350020 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350023 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350025 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350028 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 16 
13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350030 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350033 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350036 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350038 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350041 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350043 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350047 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350049 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:25.351565 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350052 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:25.352053 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350054 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:59:25.352053 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350057 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:25.352053 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350059 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:25.352053 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350062 2569 feature_gate.go:328] unrecognized 
feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:25.352053 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350066 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 13:59:25.352053 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350070 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:25.352053 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.350075 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:59:25.352053 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350189 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:25.352053 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350194 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:25.352053 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350197 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:25.352053 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350200 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:25.352053 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350203 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:25.352053 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350205 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:25.352053 ip-10-0-128-60 kubenswrapper[2569]: W0416 
13:59:25.350208 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:25.352053 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350211 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350213 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350216 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350218 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350221 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350224 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350226 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350229 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350232 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350234 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350237 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350240 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:25.352449 ip-10-0-128-60 
kubenswrapper[2569]: W0416 13:59:25.350243 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350245 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350264 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350269 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350273 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350276 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350279 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350282 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:25.352449 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350284 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350287 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350290 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350292 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350295 2569 feature_gate.go:328] unrecognized feature gate: 
BuildCSIVolumes Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350298 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350302 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350305 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350308 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350310 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350313 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350316 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350319 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350321 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350324 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350326 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350329 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 
13:59:25.350332 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350335 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350337 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:25.352946 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350340 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350342 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350345 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350348 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350350 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350354 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350358 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350361 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350365 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350369 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350372 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350375 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350378 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350380 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350383 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350385 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350388 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350391 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:25.353475 ip-10-0-128-60 kubenswrapper[2569]: W0416 
13:59:25.350394 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350397 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350400 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350403 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350405 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350408 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350410 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350413 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350415 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350418 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350421 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350423 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350426 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: 
W0416 13:59:25.350428 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350431 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350433 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350436 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350438 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350441 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350444 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:25.353915 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:25.350447 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:25.354425 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.350452 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:59:25.354425 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.351152 2569 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 13:59:25.354425 ip-10-0-128-60 
kubenswrapper[2569]: I0416 13:59:25.353286 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 13:59:25.354518 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.354511 2569 server.go:1019] "Starting client certificate rotation" Apr 16 13:59:25.354624 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.354606 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 13:59:25.354653 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.354648 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 13:59:25.380225 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.380200 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 13:59:25.384382 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.384363 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 13:59:25.402350 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.402328 2569 log.go:25] "Validated CRI v1 runtime API" Apr 16 13:59:25.407353 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.407334 2569 log.go:25] "Validated CRI v1 image API" Apr 16 13:59:25.408562 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.408545 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 13:59:25.408847 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.408826 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 13:59:25.411191 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.411172 2569 fs.go:135] Filesystem UUIDs: map[66e7e407-3c0a-4823-a23c-b4887d8989e7:/dev/nvme0n1p3 
7B77-95E7:/dev/nvme0n1p2 ec41db89-d6d6-41f2-89ea-aaa47b675d0e:/dev/nvme0n1p4] Apr 16 13:59:25.411245 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.411190 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 13:59:25.416326 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.416202 2569 manager.go:217] Machine: {Timestamp:2026-04-16 13:59:25.414960019 +0000 UTC m=+0.423196285 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100888 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2eaabcb4306c6901c422948aba6bb0 SystemUUID:ec2eaabc-b430-6c69-01c4-22948aba6bb0 BootID:579e6e2d-8fa4-43c5-a8f7-a7651a21c967 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] 
NetworkDevices:[{Name:br-ex MacAddress:02:cf:5f:f7:94:af Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:cf:5f:f7:94:af Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2a:e4:a2:e9:c9:19 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 13:59:25.416326 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.416322 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 13:59:25.416431 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.416404 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 13:59:25.417701 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.417675 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 13:59:25.417864 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.417703 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-60.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 13:59:25.417954 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.417879 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 13:59:25.417954 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.417892 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 13:59:25.417954 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.417911 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 13:59:25.418835 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.418822 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 13:59:25.420284 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.420272 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 13:59:25.420615 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.420602 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 13:59:25.423222 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.423210 2569 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 13:59:25.423309 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.423228 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 13:59:25.423309 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.423246 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 13:59:25.423309 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.423274 2569 kubelet.go:397] "Adding apiserver pod source"
Apr 16 13:59:25.423309 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.423287 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 13:59:25.424391 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.424377 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 13:59:25.424458 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.424400 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 13:59:25.427540 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.427522 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 13:59:25.429237 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.429220 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 13:59:25.430762 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.430748 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 13:59:25.430827 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.430770 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 13:59:25.430827 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.430779 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 13:59:25.430827 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.430787 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 13:59:25.430827 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.430797 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 13:59:25.430827 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.430806 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 13:59:25.430827 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.430813 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16
13:59:25.430827 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.430819 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 13:59:25.430827 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.430826 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 13:59:25.431159 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.430859 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 13:59:25.431159 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.430876 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 13:59:25.431159 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.430885 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 13:59:25.431371 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.431337 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-plrhq"
Apr 16 13:59:25.431944 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.431932 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 13:59:25.431944 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.431943 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 13:59:25.433557 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:25.433525 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 13:59:25.433557 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:25.433549 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-60.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 13:59:25.435943 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.435929 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 13:59:25.436061 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.436050 2569 server.go:1295] "Started kubelet"
Apr 16 13:59:25.436143 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.436111 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 13:59:25.436235 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.436194 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 13:59:25.436311 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.436276 2569 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 13:59:25.436847 ip-10-0-128-60 systemd[1]: Started Kubernetes Kubelet.
Apr 16 13:59:25.437143 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.437120 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-60.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 13:59:25.440547 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.440476 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 13:59:25.440751 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.440732 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-plrhq"
Apr 16 13:59:25.443216 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.443196 2569 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 13:59:25.446379 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.446362 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16
13:59:25.446480 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.446386 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 13:59:25.447185 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.447164 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 13:59:25.447185 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.447169 2569 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 13:59:25.447327 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.447199 2569 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 13:59:25.447327 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.447217 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 13:59:25.447327 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.447238 2569 factory.go:55] Registering systemd factory
Apr 16 13:59:25.447327 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.447265 2569 factory.go:223] Registration of the systemd container factory successfully
Apr 16 13:59:25.447506 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:25.447364 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-60.ec2.internal\" not found"
Apr 16 13:59:25.447506 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.447400 2569 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 13:59:25.447506 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.447410 2569 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 13:59:25.447506 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.447452 2569 factory.go:153] Registering CRI-O factory
Apr 16 13:59:25.447506 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.447463 2569 factory.go:223] Registration of the crio container factory successfully
Apr 16 13:59:25.447506 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.447485 2569 factory.go:103] Registering Raw factory
Apr 16 13:59:25.447506 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.447499 2569 manager.go:1196] Started watching for new ooms in manager
Apr 16 13:59:25.447847 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.447830 2569 manager.go:319] Starting recovery of all containers
Apr 16 13:59:25.452067 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.452045 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:25.454943 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:25.454661 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-60.ec2.internal\" not found" node="ip-10-0-128-60.ec2.internal"
Apr 16 13:59:25.457430 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.457240 2569 manager.go:324] Recovery completed
Apr 16 13:59:25.462434 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.462421 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:25.464790 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.464774 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:25.464842 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.464809 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:25.464842 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.464824 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:25.465283 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.465265 2569 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 13:59:25.465283 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.465282 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 13:59:25.465379 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.465301 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 13:59:25.467694 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.467681 2569 policy_none.go:49] "None policy: Start"
Apr 16 13:59:25.467749 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.467698 2569 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 13:59:25.467749 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.467709 2569 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 13:59:25.503949 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.503931 2569 manager.go:341] "Starting Device Plugin manager"
Apr 16 13:59:25.504063 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:25.503962 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 13:59:25.504063 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.503973 2569 server.go:85] "Starting device plugin registration server"
Apr 16 13:59:25.504246 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.504222 2569 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 13:59:25.504354 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.504245 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 13:59:25.504438 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.504416 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 13:59:25.504886 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.504509 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 13:59:25.504886 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.504526 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 13:59:25.505169 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:25.505149 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 13:59:25.505169 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:25.505191 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-60.ec2.internal\" not found"
Apr 16 13:59:25.596738 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.596664 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 13:59:25.598063 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.598044 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 13:59:25.598177 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.598084 2569 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 13:59:25.598177 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.598109 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 13:59:25.598177 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.598116 2569 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 13:59:25.598177 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:25.598153 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 13:59:25.602366 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.602347 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:25.605150 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.605137 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:25.606153 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.606134 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:25.606231 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.606167 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:25.606231 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.606177 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:25.606231 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.606202 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-60.ec2.internal"
Apr 16 13:59:25.612453 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.612438 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-60.ec2.internal"
Apr 16 13:59:25.612496 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:25.612461 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-60.ec2.internal\": node \"ip-10-0-128-60.ec2.internal\" not found"
Apr 16 13:59:25.631739 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:25.631706 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-60.ec2.internal\" not found"
Apr 16 13:59:25.698650 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.698598 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-60.ec2.internal"]
Apr 16 13:59:25.698805 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.698691 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:25.699655 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.699640 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:25.699706 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.699669 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:25.699706 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.699679 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:25.701071 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.701059 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:25.701204 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.701182 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal"
Apr 16 13:59:25.701240 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.701220 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:25.701806 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.701788 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:25.701905 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.701820 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:25.701905 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.701832 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:25.701905 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.701788 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:25.701905 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.701886 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:25.701905 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.701898 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:25.703683 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.703667 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-60.ec2.internal"
Apr 16 13:59:25.703757 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.703698 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:25.704439 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.704419 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:25.704493 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.704462 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:25.704493 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.704477 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:25.726981 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:25.726956 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-60.ec2.internal\" not found" node="ip-10-0-128-60.ec2.internal"
Apr 16 13:59:25.731466 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:25.731450 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-60.ec2.internal\" not found" node="ip-10-0-128-60.ec2.internal"
Apr 16 13:59:25.732424 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:25.732409 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-60.ec2.internal\" not found"
Apr 16 13:59:25.748655 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.748633 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/98a568f4042cac40af0eea82d9e64973-etc-kube\") pod
\"kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal\" (UID: \"98a568f4042cac40af0eea82d9e64973\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal"
Apr 16 13:59:25.748739 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.748658 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/98a568f4042cac40af0eea82d9e64973-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal\" (UID: \"98a568f4042cac40af0eea82d9e64973\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal"
Apr 16 13:59:25.748739 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.748678 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0046dd3af4f2ad1568ea51124d053499-config\") pod \"kube-apiserver-proxy-ip-10-0-128-60.ec2.internal\" (UID: \"0046dd3af4f2ad1568ea51124d053499\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-60.ec2.internal"
Apr 16 13:59:25.833160 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:25.833129 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-60.ec2.internal\" not found"
Apr 16 13:59:25.848942 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.848888 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/98a568f4042cac40af0eea82d9e64973-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal\" (UID: \"98a568f4042cac40af0eea82d9e64973\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal"
Apr 16 13:59:25.848942 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.848916 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/98a568f4042cac40af0eea82d9e64973-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal\" (UID: \"98a568f4042cac40af0eea82d9e64973\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal"
Apr 16 13:59:25.848942 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.848936 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0046dd3af4f2ad1568ea51124d053499-config\") pod \"kube-apiserver-proxy-ip-10-0-128-60.ec2.internal\" (UID: \"0046dd3af4f2ad1568ea51124d053499\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-60.ec2.internal"
Apr 16 13:59:25.849113 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.848961 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0046dd3af4f2ad1568ea51124d053499-config\") pod \"kube-apiserver-proxy-ip-10-0-128-60.ec2.internal\" (UID: \"0046dd3af4f2ad1568ea51124d053499\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-60.ec2.internal"
Apr 16 13:59:25.849113 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.848983 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/98a568f4042cac40af0eea82d9e64973-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal\" (UID: \"98a568f4042cac40af0eea82d9e64973\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal"
Apr 16 13:59:25.849113 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:25.849001 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/98a568f4042cac40af0eea82d9e64973-var-lib-kubelet\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal"
Apr 16 13:59:25.933529 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:25.933478 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-60.ec2.internal\" not found"
Apr 16 13:59:26.031028 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.030997 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal"
Apr 16 13:59:26.034363 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:26.033746 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-60.ec2.internal\" not found"
Apr 16 13:59:26.034866 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.034850 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-60.ec2.internal"
Apr 16 13:59:26.134539 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:26.134434 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-60.ec2.internal\" not found"
Apr 16 13:59:26.234950 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:26.234921 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-60.ec2.internal\" not found"
Apr 16 13:59:26.252769 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.252739 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:26.335526 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:26.335481 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-60.ec2.internal\" not found"
Apr 16 13:59:26.353880 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.353858 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using
new credentials"
Apr 16 13:59:26.354032 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.354010 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:26.354087 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.354022 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:26.354087 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.354042 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:26.436524 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:26.436456 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-60.ec2.internal\" not found"
Apr 16 13:59:26.444581 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.444541 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:54:25 +0000 UTC" deadline="2027-11-14 11:49:20.892912939 +0000 UTC"
Apr 16 13:59:26.444581 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.444579 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13845h49m54.448337675s"
Apr 16 13:59:26.446851 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.446833 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 13:59:26.459938 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.459910 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:26.479453 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.479425 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-65hhp"
Apr 16 13:59:26.487670 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.487641 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-65hhp"
Apr 16 13:59:26.529571 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.529553 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 13:59:26.536992 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:26.536972 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-60.ec2.internal\" not found"
Apr 16 13:59:26.601447 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.601401 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal" event={"ID":"98a568f4042cac40af0eea82d9e64973","Type":"ContainerStarted","Data":"30be144d1befcba29c8ab4f33945d5051dccd6b76c603edc9e9e2e9d93337855"}
Apr 16 13:59:26.637626 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:26.637571 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-60.ec2.internal\" not found"
Apr 16 13:59:26.706887 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.706818 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:26.711927 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:26.711891 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0046dd3af4f2ad1568ea51124d053499.slice/crio-521b4de301d626ee0e6c30f25f68f94a59b9779cd6c4488a1a45c30e7593b557 WatchSource:0}: Error finding container 521b4de301d626ee0e6c30f25f68f94a59b9779cd6c4488a1a45c30e7593b557: Status 404 returned error can't find the container with id 521b4de301d626ee0e6c30f25f68f94a59b9779cd6c4488a1a45c30e7593b557
Apr 16 13:59:26.746820 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.746795 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal"
Apr 16 13:59:26.758884 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.758859 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:59:26.759742 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.759726 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-60.ec2.internal"
Apr 16 13:59:26.767385 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:26.767369 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:59:27.424787 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.424557 2569 apiserver.go:52] "Watching apiserver"
Apr 16 13:59:27.440036 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.440007 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 13:59:27.442301 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.442020 2569 kubelet.go:2537] "SyncLoop ADD" source="api"
pods=["openshift-multus/multus-additional-cni-plugins-ljdss","openshift-ovn-kubernetes/ovnkube-node-jdtzv","kube-system/konnectivity-agent-9wmsd","kube-system/kube-apiserver-proxy-ip-10-0-128-60.ec2.internal","openshift-dns/node-resolver-4wsvh","openshift-multus/network-metrics-daemon-mkz26","openshift-network-diagnostics/network-check-target-ws5bp","openshift-network-operator/iptables-alerter-5tn2l","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj","openshift-cluster-node-tuning-operator/tuned-hlkp4","openshift-image-registry/node-ca-w582s","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal","openshift-multus/multus-7wc6j"] Apr 16 13:59:27.444354 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.444328 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.446422 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.446392 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.447181 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.447159 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:27.447391 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.447364 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:27.447767 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.447643 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-27xcp\"" Apr 16 13:59:27.450446 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.450300 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 13:59:27.450446 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.450375 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 13:59:27.450613 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.450595 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 13:59:27.450706 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.450674 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-6q6qc\"" Apr 16 13:59:27.450766 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.450704 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 13:59:27.450766 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.450705 2569 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 13:59:27.450865 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.450815 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4wsvh" Apr 16 13:59:27.450914 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.450820 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 13:59:27.453567 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.453327 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 13:59:27.453567 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:27.453401 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mkz26" podUID="c449dabf-b9f5-4136-b598-074040f02629" Apr 16 13:59:27.454180 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.454150 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 13:59:27.454180 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.454172 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-g9hw5\"" Apr 16 13:59:27.454347 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.454300 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 13:59:27.455494 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.455473 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-9wmsd" Apr 16 13:59:27.456695 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.456673 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-var-lib-openvswitch\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.456791 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.456713 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-run-openvswitch\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.456791 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.456740 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-node-log\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.456791 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.456764 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-systemd\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.456944 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.456789 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-var-lib-kubelet\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.456944 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.456812 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/832755a8-0ca3-4291-86b7-728e462384ee-tmp\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.456944 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.456835 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-slash\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.456944 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.456874 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-run-systemd\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.456944 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.456899 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-sysconfig\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.456944 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.456928 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-sysctl-d\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.457186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.456951 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-sysctl-conf\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.457186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.456975 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-systemd-units\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.457186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.456999 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-run-netns\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.457186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457025 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-etc-openvswitch\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 
13:59:27.457186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457057 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.457186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457082 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/55dc093f-e774-41c5-a0c2-2eaa10a6e366-agent-certs\") pod \"konnectivity-agent-9wmsd\" (UID: \"55dc093f-e774-41c5-a0c2-2eaa10a6e366\") " pod="kube-system/konnectivity-agent-9wmsd" Apr 16 13:59:27.457186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457104 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-cni-bin\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.457186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457127 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6dcff5a1-e62a-4c95-9278-292e6b914e02-ovnkube-config\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.457186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457149 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6dcff5a1-e62a-4c95-9278-292e6b914e02-ovnkube-script-lib\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.457186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457173 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/55dc093f-e774-41c5-a0c2-2eaa10a6e366-konnectivity-ca\") pod \"konnectivity-agent-9wmsd\" (UID: \"55dc093f-e774-41c5-a0c2-2eaa10a6e366\") " pod="kube-system/konnectivity-agent-9wmsd" Apr 16 13:59:27.457729 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457237 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/832755a8-0ca3-4291-86b7-728e462384ee-etc-tuned\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.457729 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457282 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6dcff5a1-e62a-4c95-9278-292e6b914e02-env-overrides\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.457729 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457310 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-lib-modules\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.457729 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457336 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-modprobe-d\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.457729 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457428 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-kubernetes\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.457729 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457538 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-sys\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.457729 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457613 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-log-socket\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.457729 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457649 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-run-ovn-kubernetes\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 
13:59:27.457729 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457682 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-cni-netd\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.457729 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457705 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-run\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.458186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457734 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nblr4\" (UniqueName: \"kubernetes.io/projected/832755a8-0ca3-4291-86b7-728e462384ee-kube-api-access-nblr4\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.458186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457758 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-kubelet\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.458186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457781 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-run-ovn\") pod \"ovnkube-node-jdtzv\" (UID: 
\"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.458186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457804 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6dcff5a1-e62a-4c95-9278-292e6b914e02-ovn-node-metrics-cert\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.458186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457835 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4rnb\" (UniqueName: \"kubernetes.io/projected/6dcff5a1-e62a-4c95-9278-292e6b914e02-kube-api-access-z4rnb\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.458186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457860 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-host\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.458186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457885 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-hbb7x\"" Apr 16 13:59:27.458186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.457926 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 13:59:27.458186 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.458028 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 13:59:27.458605 
ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.458194 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp" Apr 16 13:59:27.458605 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:27.458279 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0" Apr 16 13:59:27.461221 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.460582 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5tn2l" Apr 16 13:59:27.461221 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.460877 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.463003 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.462866 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9knzn\"" Apr 16 13:59:27.463218 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.463025 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 13:59:27.463332 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.463226 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:27.463332 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.463313 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:27.463507 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.463338 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 13:59:27.465004 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.464985 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jt6pm\"" Apr 16 13:59:27.465324 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.465226 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 13:59:27.465443 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.465356 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 13:59:27.465585 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.465568 2569 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-w582s" Apr 16 13:59:27.466190 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.466170 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.470437 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.469219 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 13:59:27.470437 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.469371 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 13:59:27.470437 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.469470 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 13:59:27.470437 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.469731 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 13:59:27.470437 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.469805 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 13:59:27.470437 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.469971 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 13:59:27.470752 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.470577 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 13:59:27.471341 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.471000 2569 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-d49wl\""
Apr 16 13:59:27.472960 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.472759 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 13:59:27.472960 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.472924 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7wc6j"
Apr 16 13:59:27.473113 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.473046 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-g24vs\""
Apr 16 13:59:27.475347 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.475310 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 13:59:27.475347 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.475335 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7c6ww\""
Apr 16 13:59:27.488975 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.488947 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:26 +0000 UTC" deadline="2027-09-11 06:33:07.23720943 +0000 UTC"
Apr 16 13:59:27.488975 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.488973 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12304h33m39.748238398s"
Apr 16 13:59:27.548468 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.548439 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 13:59:27.558924 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.558667 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-kubelet\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.558924 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.558707 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-host\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.558924 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.558742 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-registration-dir\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj"
Apr 16 13:59:27.558924 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.558759 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-kubelet\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.558924 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.558799 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-cni-binary-copy\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j"
Apr 16 13:59:27.558924 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.558828 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-multus-socket-dir-parent\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j"
Apr 16 13:59:27.558924 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.558838 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-host\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.558924 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.558852 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-var-lib-cni-multus\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j"
Apr 16 13:59:27.558924 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.558909 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/88319ece-75ee-4ddb-b42a-2a26963cba92-cnibin\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.558942 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hld7b\" (UniqueName: \"kubernetes.io/projected/d04a5bf9-7e36-4375-aad1-26af61c2c344-kube-api-access-hld7b\") pod \"node-ca-w582s\" (UID: \"d04a5bf9-7e36-4375-aad1-26af61c2c344\") " pod="openshift-image-registry/node-ca-w582s"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.558972 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-hostroot\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559010 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-slash\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559035 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-sysconfig\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559058 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-sysctl-d\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559083 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-etc-kubernetes\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559109 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b2bcf95-19e8-4acb-8c03-e7b4322a90e1-host-slash\") pod \"iptables-alerter-5tn2l\" (UID: \"4b2bcf95-19e8-4acb-8c03-e7b4322a90e1\") " pod="openshift-network-operator/iptables-alerter-5tn2l"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559129 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d04a5bf9-7e36-4375-aad1-26af61c2c344-host\") pod \"node-ca-w582s\" (UID: \"d04a5bf9-7e36-4375-aad1-26af61c2c344\") " pod="openshift-image-registry/node-ca-w582s"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559181 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d04a5bf9-7e36-4375-aad1-26af61c2c344-serviceca\") pod \"node-ca-w582s\" (UID: \"d04a5bf9-7e36-4375-aad1-26af61c2c344\") " pod="openshift-image-registry/node-ca-w582s"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559198 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-run-netns\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559212 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-etc-openvswitch\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559240 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559285 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/55dc093f-e774-41c5-a0c2-2eaa10a6e366-agent-certs\") pod \"konnectivity-agent-9wmsd\" (UID: \"55dc093f-e774-41c5-a0c2-2eaa10a6e366\") " pod="kube-system/konnectivity-agent-9wmsd"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559311 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-etc-selinux\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559338 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjz8b\" (UniqueName: \"kubernetes.io/projected/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-kube-api-access-tjz8b\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559368 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88319ece-75ee-4ddb-b42a-2a26963cba92-system-cni-dir\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss"
Apr 16 13:59:27.559498 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559395 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-cni-bin\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559428 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-run-k8s-cni-cncf-io\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559453 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4b2bcf95-19e8-4acb-8c03-e7b4322a90e1-iptables-alerter-script\") pod \"iptables-alerter-5tn2l\" (UID: \"4b2bcf95-19e8-4acb-8c03-e7b4322a90e1\") " pod="openshift-network-operator/iptables-alerter-5tn2l"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559484 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/88319ece-75ee-4ddb-b42a-2a26963cba92-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559518 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6dcff5a1-e62a-4c95-9278-292e6b914e02-env-overrides\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559544 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-lib-modules\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559571 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b248dbae-841e-4eb7-a41e-cc738673d882-hosts-file\") pod \"node-resolver-4wsvh\" (UID: \"b248dbae-841e-4eb7-a41e-cc738673d882\") " pod="openshift-dns/node-resolver-4wsvh"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559595 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b248dbae-841e-4eb7-a41e-cc738673d882-tmp-dir\") pod \"node-resolver-4wsvh\" (UID: \"b248dbae-841e-4eb7-a41e-cc738673d882\") " pod="openshift-dns/node-resolver-4wsvh"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559653 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm6nl\" (UniqueName: \"kubernetes.io/projected/c449dabf-b9f5-4136-b598-074040f02629-kube-api-access-nm6nl\") pod \"network-metrics-daemon-mkz26\" (UID: \"c449dabf-b9f5-4136-b598-074040f02629\") " pod="openshift-multus/network-metrics-daemon-mkz26"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559677 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559713 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-modprobe-d\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559739 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/88319ece-75ee-4ddb-b42a-2a26963cba92-os-release\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559772 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-run-ovn-kubernetes\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559796 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-cni-netd\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559819 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-run\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559844 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-var-lib-kubelet\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559870 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-multus-conf-dir\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j"
Apr 16 13:59:27.560290 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559899 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-run-ovn\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559918 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6dcff5a1-e62a-4c95-9278-292e6b914e02-ovn-node-metrics-cert\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559941 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4rnb\" (UniqueName: \"kubernetes.io/projected/6dcff5a1-e62a-4c95-9278-292e6b914e02-kube-api-access-z4rnb\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559957 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-run-netns\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559972 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-multus-daemon-config\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.559991 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/88319ece-75ee-4ddb-b42a-2a26963cba92-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560026 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-var-lib-openvswitch\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560079 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-run-openvswitch\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560118 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-node-log\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560150 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-systemd\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560179 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-var-lib-kubelet\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560205 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/832755a8-0ca3-4291-86b7-728e462384ee-tmp\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560270 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-socket-dir\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560300 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-run-systemd\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560326 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-sysctl-conf\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560352 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjwn8\" (UniqueName: \"kubernetes.io/projected/b248dbae-841e-4eb7-a41e-cc738673d882-kube-api-access-cjwn8\") pod \"node-resolver-4wsvh\" (UID: \"b248dbae-841e-4eb7-a41e-cc738673d882\") " pod="openshift-dns/node-resolver-4wsvh"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560385 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs\") pod \"network-metrics-daemon-mkz26\" (UID: \"c449dabf-b9f5-4136-b598-074040f02629\") " pod="openshift-multus/network-metrics-daemon-mkz26"
Apr 16 13:59:27.561023 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560411 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7nh8\" (UniqueName: \"kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8\") pod \"network-check-target-ws5bp\" (UID: \"e4325a5a-3a6c-429b-a7f3-5a19918e6fd0\") " pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560462 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-cnibin\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560488 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-run-multus-certs\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560506 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-slash\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560591 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-run-ovn-kubernetes\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560643 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-modprobe-d\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560681 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-cni-netd\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560696 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-systemd\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560748 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-run\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560765 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-cni-bin\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560860 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-var-lib-kubelet\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561091 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-run-systemd\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561126 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561265 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-sysctl-conf\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561375 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6dcff5a1-e62a-4c95-9278-292e6b914e02-env-overrides\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561492 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-run-ovn\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561488 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-sysctl-d\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561526 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-lib-modules\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.561763 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.560513 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/88319ece-75ee-4ddb-b42a-2a26963cba92-cni-binary-copy\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss"
Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561573 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-run-openvswitch\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561599 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-systemd-units\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561633 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-device-dir\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj"
Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561658 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-sys-fs\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj"
Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561700 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-system-cni-dir\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j"
Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561730 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2m42\" (UniqueName: \"kubernetes.io/projected/88319ece-75ee-4ddb-b42a-2a26963cba92-kube-api-access-w2m42\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss"
Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561769 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6dcff5a1-e62a-4c95-9278-292e6b914e02-ovnkube-config\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561828 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-sysconfig\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561866 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-var-lib-openvswitch\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561901 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-systemd-units\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.561948 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6dcff5a1-e62a-4c95-9278-292e6b914e02-ovnkube-script-lib\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562017 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/55dc093f-e774-41c5-a0c2-2eaa10a6e366-konnectivity-ca\") pod \"konnectivity-agent-9wmsd\" (UID: \"55dc093f-e774-41c5-a0c2-2eaa10a6e366\") " pod="kube-system/konnectivity-agent-9wmsd"
Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562060 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/832755a8-0ca3-4291-86b7-728e462384ee-etc-tuned\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4"
Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562103 2569 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5brx\" (UniqueName: \"kubernetes.io/projected/7a859253-7bd8-487d-9ee8-7b85cb9cb528-kube-api-access-g5brx\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562142 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-os-release\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562171 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/88319ece-75ee-4ddb-b42a-2a26963cba92-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.562642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562241 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6dcff5a1-e62a-4c95-9278-292e6b914e02-ovnkube-config\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.563442 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562269 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-multus-cni-dir\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 
13:59:27.563442 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562325 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-var-lib-cni-bin\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.563442 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562355 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrgmz\" (UniqueName: \"kubernetes.io/projected/4b2bcf95-19e8-4acb-8c03-e7b4322a90e1-kube-api-access-rrgmz\") pod \"iptables-alerter-5tn2l\" (UID: \"4b2bcf95-19e8-4acb-8c03-e7b4322a90e1\") " pod="openshift-network-operator/iptables-alerter-5tn2l" Apr 16 13:59:27.563442 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562385 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-kubernetes\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.563442 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562434 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-sys\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.563442 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562437 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6dcff5a1-e62a-4c95-9278-292e6b914e02-ovnkube-script-lib\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.563442 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562460 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-log-socket\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.563442 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562485 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nblr4\" (UniqueName: \"kubernetes.io/projected/832755a8-0ca3-4291-86b7-728e462384ee-kube-api-access-nblr4\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.563442 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562501 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-etc-kubernetes\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.563442 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562555 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-log-socket\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.563442 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562597 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/832755a8-0ca3-4291-86b7-728e462384ee-sys\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " 
pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.563442 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562738 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/55dc093f-e774-41c5-a0c2-2eaa10a6e366-konnectivity-ca\") pod \"konnectivity-agent-9wmsd\" (UID: \"55dc093f-e774-41c5-a0c2-2eaa10a6e366\") " pod="kube-system/konnectivity-agent-9wmsd" Apr 16 13:59:27.563442 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562763 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-etc-openvswitch\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.563442 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562790 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-node-log\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.563442 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562790 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.563442 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.562822 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6dcff5a1-e62a-4c95-9278-292e6b914e02-host-run-netns\") pod \"ovnkube-node-jdtzv\" (UID: 
\"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.566393 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.566333 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/832755a8-0ca3-4291-86b7-728e462384ee-etc-tuned\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.567022 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.566998 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/55dc093f-e774-41c5-a0c2-2eaa10a6e366-agent-certs\") pod \"konnectivity-agent-9wmsd\" (UID: \"55dc093f-e774-41c5-a0c2-2eaa10a6e366\") " pod="kube-system/konnectivity-agent-9wmsd" Apr 16 13:59:27.567721 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.567686 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6dcff5a1-e62a-4c95-9278-292e6b914e02-ovn-node-metrics-cert\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.568896 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.568872 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/832755a8-0ca3-4291-86b7-728e462384ee-tmp\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.571174 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.571151 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nblr4\" (UniqueName: \"kubernetes.io/projected/832755a8-0ca3-4291-86b7-728e462384ee-kube-api-access-nblr4\") pod \"tuned-hlkp4\" (UID: \"832755a8-0ca3-4291-86b7-728e462384ee\") " 
pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.571601 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.571583 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4rnb\" (UniqueName: \"kubernetes.io/projected/6dcff5a1-e62a-4c95-9278-292e6b914e02-kube-api-access-z4rnb\") pod \"ovnkube-node-jdtzv\" (UID: \"6dcff5a1-e62a-4c95-9278-292e6b914e02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.578238 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.578209 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:27.601013 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.600984 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:27.603900 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.603864 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-60.ec2.internal" event={"ID":"0046dd3af4f2ad1568ea51124d053499","Type":"ContainerStarted","Data":"521b4de301d626ee0e6c30f25f68f94a59b9779cd6c4488a1a45c30e7593b557"} Apr 16 13:59:27.663212 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663173 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5brx\" (UniqueName: \"kubernetes.io/projected/7a859253-7bd8-487d-9ee8-7b85cb9cb528-kube-api-access-g5brx\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.663408 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663230 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-os-release\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " 
pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.663408 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663274 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/88319ece-75ee-4ddb-b42a-2a26963cba92-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.663408 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663299 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-multus-cni-dir\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.663408 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663322 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-var-lib-cni-bin\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.663408 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663347 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrgmz\" (UniqueName: \"kubernetes.io/projected/4b2bcf95-19e8-4acb-8c03-e7b4322a90e1-kube-api-access-rrgmz\") pod \"iptables-alerter-5tn2l\" (UID: \"4b2bcf95-19e8-4acb-8c03-e7b4322a90e1\") " pod="openshift-network-operator/iptables-alerter-5tn2l" Apr 16 13:59:27.663408 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663379 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-registration-dir\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: 
\"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.663408 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663404 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-cni-binary-copy\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.663755 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663428 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-multus-socket-dir-parent\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.663755 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663455 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-var-lib-cni-multus\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.663755 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663478 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/88319ece-75ee-4ddb-b42a-2a26963cba92-cnibin\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.663755 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663537 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/88319ece-75ee-4ddb-b42a-2a26963cba92-cnibin\") pod 
\"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.663755 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663629 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-registration-dir\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.664032 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663843 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-multus-cni-dir\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.664032 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663890 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-var-lib-cni-bin\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.664032 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663920 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hld7b\" (UniqueName: \"kubernetes.io/projected/d04a5bf9-7e36-4375-aad1-26af61c2c344-kube-api-access-hld7b\") pod \"node-ca-w582s\" (UID: \"d04a5bf9-7e36-4375-aad1-26af61c2c344\") " pod="openshift-image-registry/node-ca-w582s" Apr 16 13:59:27.664032 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663946 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-hostroot\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.664032 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663971 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-etc-kubernetes\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.664032 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.663990 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b2bcf95-19e8-4acb-8c03-e7b4322a90e1-host-slash\") pod \"iptables-alerter-5tn2l\" (UID: \"4b2bcf95-19e8-4acb-8c03-e7b4322a90e1\") " pod="openshift-network-operator/iptables-alerter-5tn2l" Apr 16 13:59:27.664032 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664004 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d04a5bf9-7e36-4375-aad1-26af61c2c344-host\") pod \"node-ca-w582s\" (UID: \"d04a5bf9-7e36-4375-aad1-26af61c2c344\") " pod="openshift-image-registry/node-ca-w582s" Apr 16 13:59:27.664032 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664024 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d04a5bf9-7e36-4375-aad1-26af61c2c344-serviceca\") pod \"node-ca-w582s\" (UID: \"d04a5bf9-7e36-4375-aad1-26af61c2c344\") " pod="openshift-image-registry/node-ca-w582s" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664051 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-etc-selinux\") pod 
\"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664080 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjz8b\" (UniqueName: \"kubernetes.io/projected/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-kube-api-access-tjz8b\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664100 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88319ece-75ee-4ddb-b42a-2a26963cba92-system-cni-dir\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664124 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-run-k8s-cni-cncf-io\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664139 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4b2bcf95-19e8-4acb-8c03-e7b4322a90e1-iptables-alerter-script\") pod \"iptables-alerter-5tn2l\" (UID: \"4b2bcf95-19e8-4acb-8c03-e7b4322a90e1\") " pod="openshift-network-operator/iptables-alerter-5tn2l" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664159 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/88319ece-75ee-4ddb-b42a-2a26963cba92-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664186 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b248dbae-841e-4eb7-a41e-cc738673d882-hosts-file\") pod \"node-resolver-4wsvh\" (UID: \"b248dbae-841e-4eb7-a41e-cc738673d882\") " pod="openshift-dns/node-resolver-4wsvh" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664209 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b248dbae-841e-4eb7-a41e-cc738673d882-tmp-dir\") pod \"node-resolver-4wsvh\" (UID: \"b248dbae-841e-4eb7-a41e-cc738673d882\") " pod="openshift-dns/node-resolver-4wsvh" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664218 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-os-release\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664231 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nm6nl\" (UniqueName: \"kubernetes.io/projected/c449dabf-b9f5-4136-b598-074040f02629-kube-api-access-nm6nl\") pod \"network-metrics-daemon-mkz26\" (UID: \"c449dabf-b9f5-4136-b598-074040f02629\") " pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664276 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664296 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-multus-socket-dir-parent\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664305 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/88319ece-75ee-4ddb-b42a-2a26963cba92-os-release\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664302 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/88319ece-75ee-4ddb-b42a-2a26963cba92-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664344 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-var-lib-cni-multus\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664352 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-hostroot\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.664424 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664364 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-run-k8s-cni-cncf-io\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664384 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-var-lib-kubelet\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664396 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88319ece-75ee-4ddb-b42a-2a26963cba92-system-cni-dir\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664411 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-multus-conf-dir\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664438 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-etc-kubernetes\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664441 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-run-netns\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664473 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-multus-daemon-config\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664502 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/88319ece-75ee-4ddb-b42a-2a26963cba92-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664536 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-multus-conf-dir\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664539 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-socket-dir\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664569 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjwn8\" (UniqueName: \"kubernetes.io/projected/b248dbae-841e-4eb7-a41e-cc738673d882-kube-api-access-cjwn8\") pod \"node-resolver-4wsvh\" (UID: \"b248dbae-841e-4eb7-a41e-cc738673d882\") " pod="openshift-dns/node-resolver-4wsvh" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664594 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs\") pod \"network-metrics-daemon-mkz26\" (UID: \"c449dabf-b9f5-4136-b598-074040f02629\") " pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664622 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7nh8\" (UniqueName: \"kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8\") pod \"network-check-target-ws5bp\" (UID: \"e4325a5a-3a6c-429b-a7f3-5a19918e6fd0\") " pod="openshift-network-diagnostics/network-check-target-ws5bp" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664647 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-cnibin\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664160 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-cni-binary-copy\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664474 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-run-netns\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664671 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-run-multus-certs\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.665058 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664701 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-run-multus-certs\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664733 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/88319ece-75ee-4ddb-b42a-2a26963cba92-cni-binary-copy\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664759 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b248dbae-841e-4eb7-a41e-cc738673d882-hosts-file\") pod \"node-resolver-4wsvh\" (UID: \"b248dbae-841e-4eb7-a41e-cc738673d882\") " pod="openshift-dns/node-resolver-4wsvh" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664764 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d04a5bf9-7e36-4375-aad1-26af61c2c344-host\") pod \"node-ca-w582s\" (UID: \"d04a5bf9-7e36-4375-aad1-26af61c2c344\") " pod="openshift-image-registry/node-ca-w582s" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664770 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-device-dir\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664822 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b248dbae-841e-4eb7-a41e-cc738673d882-tmp-dir\") pod \"node-resolver-4wsvh\" (UID: \"b248dbae-841e-4eb7-a41e-cc738673d882\") " pod="openshift-dns/node-resolver-4wsvh" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664834 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4b2bcf95-19e8-4acb-8c03-e7b4322a90e1-iptables-alerter-script\") pod \"iptables-alerter-5tn2l\" (UID: \"4b2bcf95-19e8-4acb-8c03-e7b4322a90e1\") " pod="openshift-network-operator/iptables-alerter-5tn2l" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664822 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-sys-fs\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664859 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-sys-fs\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664868 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-system-cni-dir\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664895 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-etc-selinux\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664919 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-system-cni-dir\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664926 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/88319ece-75ee-4ddb-b42a-2a26963cba92-os-release\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664955 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2m42\" (UniqueName: \"kubernetes.io/projected/88319ece-75ee-4ddb-b42a-2a26963cba92-kube-api-access-w2m42\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.664503 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-host-var-lib-kubelet\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.665143 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-socket-dir\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.665151 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-device-dir\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.665268 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/88319ece-75ee-4ddb-b42a-2a26963cba92-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.665708 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:27.665312 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:27.666510 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.665319 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b2bcf95-19e8-4acb-8c03-e7b4322a90e1-host-slash\") pod \"iptables-alerter-5tn2l\" (UID: \"4b2bcf95-19e8-4acb-8c03-e7b4322a90e1\") " pod="openshift-network-operator/iptables-alerter-5tn2l" Apr 16 13:59:27.666510 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.665369 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-cnibin\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.666510 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.665389 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a859253-7bd8-487d-9ee8-7b85cb9cb528-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.666510 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:27.665404 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs podName:c449dabf-b9f5-4136-b598-074040f02629 
nodeName:}" failed. No retries permitted until 2026-04-16 13:59:28.165372318 +0000 UTC m=+3.173608556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs") pod "network-metrics-daemon-mkz26" (UID: "c449dabf-b9f5-4136-b598-074040f02629") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:27.666510 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.665445 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/88319ece-75ee-4ddb-b42a-2a26963cba92-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.666510 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.665580 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-multus-daemon-config\") pod \"multus-7wc6j\" (UID: \"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.666510 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.665632 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d04a5bf9-7e36-4375-aad1-26af61c2c344-serviceca\") pod \"node-ca-w582s\" (UID: \"d04a5bf9-7e36-4375-aad1-26af61c2c344\") " pod="openshift-image-registry/node-ca-w582s" Apr 16 13:59:27.666510 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.665877 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/88319ece-75ee-4ddb-b42a-2a26963cba92-cni-binary-copy\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " 
pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.674235 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.674203 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5brx\" (UniqueName: \"kubernetes.io/projected/7a859253-7bd8-487d-9ee8-7b85cb9cb528-kube-api-access-g5brx\") pod \"aws-ebs-csi-driver-node-nz5dj\" (UID: \"7a859253-7bd8-487d-9ee8-7b85cb9cb528\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.679955 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:27.679374 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:27.679955 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:27.679398 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:27.679955 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:27.679414 2569 projected.go:194] Error preparing data for projected volume kube-api-access-n7nh8 for pod openshift-network-diagnostics/network-check-target-ws5bp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:27.679955 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:27.679493 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8 podName:e4325a5a-3a6c-429b-a7f3-5a19918e6fd0 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:28.179474818 +0000 UTC m=+3.187711056 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n7nh8" (UniqueName: "kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8") pod "network-check-target-ws5bp" (UID: "e4325a5a-3a6c-429b-a7f3-5a19918e6fd0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:27.679955 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.679561 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrgmz\" (UniqueName: \"kubernetes.io/projected/4b2bcf95-19e8-4acb-8c03-e7b4322a90e1-kube-api-access-rrgmz\") pod \"iptables-alerter-5tn2l\" (UID: \"4b2bcf95-19e8-4acb-8c03-e7b4322a90e1\") " pod="openshift-network-operator/iptables-alerter-5tn2l" Apr 16 13:59:27.679955 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.679578 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2m42\" (UniqueName: \"kubernetes.io/projected/88319ece-75ee-4ddb-b42a-2a26963cba92-kube-api-access-w2m42\") pod \"multus-additional-cni-plugins-ljdss\" (UID: \"88319ece-75ee-4ddb-b42a-2a26963cba92\") " pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.681732 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.681684 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm6nl\" (UniqueName: \"kubernetes.io/projected/c449dabf-b9f5-4136-b598-074040f02629-kube-api-access-nm6nl\") pod \"network-metrics-daemon-mkz26\" (UID: \"c449dabf-b9f5-4136-b598-074040f02629\") " pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 13:59:27.682222 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.682199 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjz8b\" (UniqueName: \"kubernetes.io/projected/ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f-kube-api-access-tjz8b\") pod \"multus-7wc6j\" (UID: 
\"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f\") " pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.682925 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.682883 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjwn8\" (UniqueName: \"kubernetes.io/projected/b248dbae-841e-4eb7-a41e-cc738673d882-kube-api-access-cjwn8\") pod \"node-resolver-4wsvh\" (UID: \"b248dbae-841e-4eb7-a41e-cc738673d882\") " pod="openshift-dns/node-resolver-4wsvh" Apr 16 13:59:27.683362 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.683345 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hld7b\" (UniqueName: \"kubernetes.io/projected/d04a5bf9-7e36-4375-aad1-26af61c2c344-kube-api-access-hld7b\") pod \"node-ca-w582s\" (UID: \"d04a5bf9-7e36-4375-aad1-26af61c2c344\") " pod="openshift-image-registry/node-ca-w582s" Apr 16 13:59:27.759155 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.758900 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" Apr 16 13:59:27.767806 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:27.767767 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod832755a8_0ca3_4291_86b7_728e462384ee.slice/crio-d9e67d60472709700779293250f35c7b645e3751f10ff881debb566496b9499a WatchSource:0}: Error finding container d9e67d60472709700779293250f35c7b645e3751f10ff881debb566496b9499a: Status 404 returned error can't find the container with id d9e67d60472709700779293250f35c7b645e3751f10ff881debb566496b9499a Apr 16 13:59:27.771437 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.771413 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 13:59:27.779022 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:27.778993 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dcff5a1_e62a_4c95_9278_292e6b914e02.slice/crio-8bb8291e698319472925f3f6951398bee845823c68726638d30bec8af8eec311 WatchSource:0}: Error finding container 8bb8291e698319472925f3f6951398bee845823c68726638d30bec8af8eec311: Status 404 returned error can't find the container with id 8bb8291e698319472925f3f6951398bee845823c68726638d30bec8af8eec311 Apr 16 13:59:27.780501 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.780365 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4wsvh" Apr 16 13:59:27.787171 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.787151 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9wmsd" Apr 16 13:59:27.788685 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:27.788651 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb248dbae_841e_4eb7_a41e_cc738673d882.slice/crio-e60c7d0d2766a43fc6220cc71d78655cde881823e34432d3a6cc1ed8d65159e6 WatchSource:0}: Error finding container e60c7d0d2766a43fc6220cc71d78655cde881823e34432d3a6cc1ed8d65159e6: Status 404 returned error can't find the container with id e60c7d0d2766a43fc6220cc71d78655cde881823e34432d3a6cc1ed8d65159e6 Apr 16 13:59:27.794509 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.794235 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-5tn2l" Apr 16 13:59:27.798606 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:27.798576 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55dc093f_e774_41c5_a0c2_2eaa10a6e366.slice/crio-80bad7dab6dfc3629f6bbe483a440684bd5e4fc189541eef057dfc0af937d760 WatchSource:0}: Error finding container 80bad7dab6dfc3629f6bbe483a440684bd5e4fc189541eef057dfc0af937d760: Status 404 returned error can't find the container with id 80bad7dab6dfc3629f6bbe483a440684bd5e4fc189541eef057dfc0af937d760 Apr 16 13:59:27.803469 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.803448 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" Apr 16 13:59:27.804063 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:27.803948 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b2bcf95_19e8_4acb_8c03_e7b4322a90e1.slice/crio-3adc410520853eb291ca38defc144e384c591310c80e5e7279a462e22c0e2dbb WatchSource:0}: Error finding container 3adc410520853eb291ca38defc144e384c591310c80e5e7279a462e22c0e2dbb: Status 404 returned error can't find the container with id 3adc410520853eb291ca38defc144e384c591310c80e5e7279a462e22c0e2dbb Apr 16 13:59:27.811570 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.811547 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-w582s" Apr 16 13:59:27.811853 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:27.811824 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a859253_7bd8_487d_9ee8_7b85cb9cb528.slice/crio-bb79489a5f3ddfdd79f43a1bcc511374272073e68337303a337256515041ecc1 WatchSource:0}: Error finding container bb79489a5f3ddfdd79f43a1bcc511374272073e68337303a337256515041ecc1: Status 404 returned error can't find the container with id bb79489a5f3ddfdd79f43a1bcc511374272073e68337303a337256515041ecc1 Apr 16 13:59:27.818490 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.818471 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ljdss" Apr 16 13:59:27.818759 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:27.818735 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd04a5bf9_7e36_4375_aad1_26af61c2c344.slice/crio-f5fbb0231f6832614670e8bccd931a03a57be68c6445d429436df70683902e4a WatchSource:0}: Error finding container f5fbb0231f6832614670e8bccd931a03a57be68c6445d429436df70683902e4a: Status 404 returned error can't find the container with id f5fbb0231f6832614670e8bccd931a03a57be68c6445d429436df70683902e4a Apr 16 13:59:27.826242 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:27.826219 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-7wc6j" Apr 16 13:59:27.827202 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:27.827180 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88319ece_75ee_4ddb_b42a_2a26963cba92.slice/crio-c4676577edde531e69d1bebbf3bd9adc79fb33d9c3334e636bb5360e457f2498 WatchSource:0}: Error finding container c4676577edde531e69d1bebbf3bd9adc79fb33d9c3334e636bb5360e457f2498: Status 404 returned error can't find the container with id c4676577edde531e69d1bebbf3bd9adc79fb33d9c3334e636bb5360e457f2498 Apr 16 13:59:27.835598 ip-10-0-128-60 kubenswrapper[2569]: W0416 13:59:27.835570 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce8cf9fa_1ebb_4d35_ad03_c167ea484b2f.slice/crio-e99748639845526b325ba9ae66a5cab86548c06c4806ce1bf5486c8cb995255d WatchSource:0}: Error finding container e99748639845526b325ba9ae66a5cab86548c06c4806ce1bf5486c8cb995255d: Status 404 returned error can't find the container with id e99748639845526b325ba9ae66a5cab86548c06c4806ce1bf5486c8cb995255d Apr 16 13:59:28.168411 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:28.168340 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs\") pod \"network-metrics-daemon-mkz26\" (UID: \"c449dabf-b9f5-4136-b598-074040f02629\") " pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 13:59:28.168548 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:28.168453 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:28.168548 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:28.168503 2569 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs podName:c449dabf-b9f5-4136-b598-074040f02629 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:29.168489909 +0000 UTC m=+4.176726142 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs") pod "network-metrics-daemon-mkz26" (UID: "c449dabf-b9f5-4136-b598-074040f02629") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:28.269272 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:28.269230 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7nh8\" (UniqueName: \"kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8\") pod \"network-check-target-ws5bp\" (UID: \"e4325a5a-3a6c-429b-a7f3-5a19918e6fd0\") " pod="openshift-network-diagnostics/network-check-target-ws5bp" Apr 16 13:59:28.269438 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:28.269384 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:28.269438 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:28.269408 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:28.269438 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:28.269422 2569 projected.go:194] Error preparing data for projected volume kube-api-access-n7nh8 for pod openshift-network-diagnostics/network-check-target-ws5bp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:28.269587 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:28.269476 2569 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8 podName:e4325a5a-3a6c-429b-a7f3-5a19918e6fd0 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:29.269461321 +0000 UTC m=+4.277697554 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-n7nh8" (UniqueName: "kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8") pod "network-check-target-ws5bp" (UID: "e4325a5a-3a6c-429b-a7f3-5a19918e6fd0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:28.489419 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:28.489325 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:26 +0000 UTC" deadline="2028-01-16 00:24:43.071149146 +0000 UTC" Apr 16 13:59:28.489419 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:28.489362 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15346h25m14.581790687s" Apr 16 13:59:28.599174 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:28.599135 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp" Apr 16 13:59:28.599372 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:28.599290 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0" Apr 16 13:59:28.606576 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:28.606540 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7wc6j" event={"ID":"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f","Type":"ContainerStarted","Data":"e99748639845526b325ba9ae66a5cab86548c06c4806ce1bf5486c8cb995255d"} Apr 16 13:59:28.607738 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:28.607694 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljdss" event={"ID":"88319ece-75ee-4ddb-b42a-2a26963cba92","Type":"ContainerStarted","Data":"c4676577edde531e69d1bebbf3bd9adc79fb33d9c3334e636bb5360e457f2498"} Apr 16 13:59:28.608898 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:28.608859 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w582s" event={"ID":"d04a5bf9-7e36-4375-aad1-26af61c2c344","Type":"ContainerStarted","Data":"f5fbb0231f6832614670e8bccd931a03a57be68c6445d429436df70683902e4a"} Apr 16 13:59:28.609888 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:28.609862 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5tn2l" event={"ID":"4b2bcf95-19e8-4acb-8c03-e7b4322a90e1","Type":"ContainerStarted","Data":"3adc410520853eb291ca38defc144e384c591310c80e5e7279a462e22c0e2dbb"} Apr 16 13:59:28.611180 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:28.611153 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9wmsd" event={"ID":"55dc093f-e774-41c5-a0c2-2eaa10a6e366","Type":"ContainerStarted","Data":"80bad7dab6dfc3629f6bbe483a440684bd5e4fc189541eef057dfc0af937d760"} Apr 16 13:59:28.612468 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:28.612429 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" 
event={"ID":"832755a8-0ca3-4291-86b7-728e462384ee","Type":"ContainerStarted","Data":"d9e67d60472709700779293250f35c7b645e3751f10ff881debb566496b9499a"} Apr 16 13:59:28.613701 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:28.613673 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" event={"ID":"7a859253-7bd8-487d-9ee8-7b85cb9cb528","Type":"ContainerStarted","Data":"bb79489a5f3ddfdd79f43a1bcc511374272073e68337303a337256515041ecc1"} Apr 16 13:59:28.614773 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:28.614749 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4wsvh" event={"ID":"b248dbae-841e-4eb7-a41e-cc738673d882","Type":"ContainerStarted","Data":"e60c7d0d2766a43fc6220cc71d78655cde881823e34432d3a6cc1ed8d65159e6"} Apr 16 13:59:28.615940 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:28.615914 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" event={"ID":"6dcff5a1-e62a-4c95-9278-292e6b914e02","Type":"ContainerStarted","Data":"8bb8291e698319472925f3f6951398bee845823c68726638d30bec8af8eec311"} Apr 16 13:59:29.071896 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:29.071570 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:29.176147 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:29.176109 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs\") pod \"network-metrics-daemon-mkz26\" (UID: \"c449dabf-b9f5-4136-b598-074040f02629\") " pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 13:59:29.176327 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:29.176291 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Apr 16 13:59:29.176384 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:29.176361 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs podName:c449dabf-b9f5-4136-b598-074040f02629 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:31.176342957 +0000 UTC m=+6.184579193 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs") pod "network-metrics-daemon-mkz26" (UID: "c449dabf-b9f5-4136-b598-074040f02629") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:29.278307 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:29.277584 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7nh8\" (UniqueName: \"kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8\") pod \"network-check-target-ws5bp\" (UID: \"e4325a5a-3a6c-429b-a7f3-5a19918e6fd0\") " pod="openshift-network-diagnostics/network-check-target-ws5bp" Apr 16 13:59:29.278307 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:29.277790 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:29.278307 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:29.277810 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:29.278307 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:29.277824 2569 projected.go:194] Error preparing data for projected volume kube-api-access-n7nh8 for pod openshift-network-diagnostics/network-check-target-ws5bp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:29.278307 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:29.277884 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8 podName:e4325a5a-3a6c-429b-a7f3-5a19918e6fd0 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:31.277866191 +0000 UTC m=+6.286102429 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-n7nh8" (UniqueName: "kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8") pod "network-check-target-ws5bp" (UID: "e4325a5a-3a6c-429b-a7f3-5a19918e6fd0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:29.601564 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:29.599411 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 13:59:29.601564 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:29.599560 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mkz26" podUID="c449dabf-b9f5-4136-b598-074040f02629" Apr 16 13:59:29.632027 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:29.631336 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-60.ec2.internal" event={"ID":"0046dd3af4f2ad1568ea51124d053499","Type":"ContainerStarted","Data":"31713857d45dc82062a569c6804ac18e5b9343c3875a71f6d3e509848cb7281c"} Apr 16 13:59:29.650806 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:29.650722 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-60.ec2.internal" podStartSLOduration=3.650702169 podStartE2EDuration="3.650702169s" podCreationTimestamp="2026-04-16 13:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:29.649773495 +0000 UTC m=+4.658009752" watchObservedRunningTime="2026-04-16 13:59:29.650702169 +0000 UTC m=+4.658938429" Apr 16 13:59:30.598947 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:30.598911 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp" Apr 16 13:59:30.599139 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:30.599051 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0" Apr 16 13:59:30.644757 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:30.644718 2569 generic.go:358] "Generic (PLEG): container finished" podID="98a568f4042cac40af0eea82d9e64973" containerID="8acf3d7fdf6d8ef17ebd1673600e8f3aec8c02407b674def6b94b0261d4a2e80" exitCode=0 Apr 16 13:59:30.645337 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:30.645300 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal" event={"ID":"98a568f4042cac40af0eea82d9e64973","Type":"ContainerDied","Data":"8acf3d7fdf6d8ef17ebd1673600e8f3aec8c02407b674def6b94b0261d4a2e80"} Apr 16 13:59:31.195739 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:31.195694 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs\") pod \"network-metrics-daemon-mkz26\" (UID: \"c449dabf-b9f5-4136-b598-074040f02629\") " pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 13:59:31.195945 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:31.195880 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:31.196015 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:31.195948 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs podName:c449dabf-b9f5-4136-b598-074040f02629 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:35.195928689 +0000 UTC m=+10.204164924 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs") pod "network-metrics-daemon-mkz26" (UID: "c449dabf-b9f5-4136-b598-074040f02629") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:31.296956 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:31.296916 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7nh8\" (UniqueName: \"kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8\") pod \"network-check-target-ws5bp\" (UID: \"e4325a5a-3a6c-429b-a7f3-5a19918e6fd0\") " pod="openshift-network-diagnostics/network-check-target-ws5bp" Apr 16 13:59:31.297150 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:31.297113 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:31.297150 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:31.297135 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:31.297150 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:31.297148 2569 projected.go:194] Error preparing data for projected volume kube-api-access-n7nh8 for pod openshift-network-diagnostics/network-check-target-ws5bp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:31.297350 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:31.297333 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8 podName:e4325a5a-3a6c-429b-a7f3-5a19918e6fd0 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:35.297192145 +0000 UTC m=+10.305428397 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-n7nh8" (UniqueName: "kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8") pod "network-check-target-ws5bp" (UID: "e4325a5a-3a6c-429b-a7f3-5a19918e6fd0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:31.599737 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:31.599138 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 13:59:31.599737 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:31.599343 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mkz26" podUID="c449dabf-b9f5-4136-b598-074040f02629" Apr 16 13:59:32.599323 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:32.598769 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp" Apr 16 13:59:32.599323 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:32.598900 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0" Apr 16 13:59:33.599444 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:33.599410 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 13:59:33.599918 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:33.599553 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mkz26" podUID="c449dabf-b9f5-4136-b598-074040f02629" Apr 16 13:59:34.555194 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:34.555158 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6dbn9"] Apr 16 13:59:34.559313 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:34.559238 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 13:59:34.559448 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:34.559338 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6dbn9" podUID="087e7b97-349b-4c1c-a604-82fcaaa88534" Apr 16 13:59:34.599340 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:34.599300 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp" Apr 16 13:59:34.599510 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:34.599423 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0" Apr 16 13:59:34.627320 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:34.627082 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/087e7b97-349b-4c1c-a604-82fcaaa88534-dbus\") pod \"global-pull-secret-syncer-6dbn9\" (UID: \"087e7b97-349b-4c1c-a604-82fcaaa88534\") " pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 13:59:34.627320 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:34.627136 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret\") pod \"global-pull-secret-syncer-6dbn9\" (UID: \"087e7b97-349b-4c1c-a604-82fcaaa88534\") " pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 13:59:34.627320 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:34.627231 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/087e7b97-349b-4c1c-a604-82fcaaa88534-kubelet-config\") pod \"global-pull-secret-syncer-6dbn9\" (UID: \"087e7b97-349b-4c1c-a604-82fcaaa88534\") " pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 13:59:34.727874 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:34.727759 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/087e7b97-349b-4c1c-a604-82fcaaa88534-dbus\") pod \"global-pull-secret-syncer-6dbn9\" (UID: \"087e7b97-349b-4c1c-a604-82fcaaa88534\") " pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 13:59:34.727874 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:34.727810 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret\") pod \"global-pull-secret-syncer-6dbn9\" (UID: \"087e7b97-349b-4c1c-a604-82fcaaa88534\") " pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 13:59:34.728196 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:34.727891 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/087e7b97-349b-4c1c-a604-82fcaaa88534-kubelet-config\") pod \"global-pull-secret-syncer-6dbn9\" (UID: \"087e7b97-349b-4c1c-a604-82fcaaa88534\") " pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 13:59:34.728196 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:34.728006 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/087e7b97-349b-4c1c-a604-82fcaaa88534-kubelet-config\") pod \"global-pull-secret-syncer-6dbn9\" (UID: \"087e7b97-349b-4c1c-a604-82fcaaa88534\") " pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 13:59:34.728196 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:34.728014 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/087e7b97-349b-4c1c-a604-82fcaaa88534-dbus\") pod \"global-pull-secret-syncer-6dbn9\" (UID: \"087e7b97-349b-4c1c-a604-82fcaaa88534\") " pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 13:59:34.728196 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:34.728117 
2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:34.728196 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:34.728175 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret podName:087e7b97-349b-4c1c-a604-82fcaaa88534 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:35.228155611 +0000 UTC m=+10.236391851 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret") pod "global-pull-secret-syncer-6dbn9" (UID: "087e7b97-349b-4c1c-a604-82fcaaa88534") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:35.231957 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:35.231919 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs\") pod \"network-metrics-daemon-mkz26\" (UID: \"c449dabf-b9f5-4136-b598-074040f02629\") " pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 13:59:35.232139 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:35.232013 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret\") pod \"global-pull-secret-syncer-6dbn9\" (UID: \"087e7b97-349b-4c1c-a604-82fcaaa88534\") " pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 13:59:35.232210 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:35.232138 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:35.232210 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:35.232198 2569 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret podName:087e7b97-349b-4c1c-a604-82fcaaa88534 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:36.232179819 +0000 UTC m=+11.240416058 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret") pod "global-pull-secret-syncer-6dbn9" (UID: "087e7b97-349b-4c1c-a604-82fcaaa88534") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:35.232445 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:35.232326 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:35.232445 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:35.232378 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs podName:c449dabf-b9f5-4136-b598-074040f02629 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:43.232365517 +0000 UTC m=+18.240601749 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs") pod "network-metrics-daemon-mkz26" (UID: "c449dabf-b9f5-4136-b598-074040f02629") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:35.332702 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:35.332663 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7nh8\" (UniqueName: \"kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8\") pod \"network-check-target-ws5bp\" (UID: \"e4325a5a-3a6c-429b-a7f3-5a19918e6fd0\") " pod="openshift-network-diagnostics/network-check-target-ws5bp" Apr 16 13:59:35.332864 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:35.332822 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:35.332864 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:35.332841 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:35.332864 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:35.332852 2569 projected.go:194] Error preparing data for projected volume kube-api-access-n7nh8 for pod openshift-network-diagnostics/network-check-target-ws5bp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:35.333033 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:35.332900 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8 podName:e4325a5a-3a6c-429b-a7f3-5a19918e6fd0 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:43.332887232 +0000 UTC m=+18.341123464 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-n7nh8" (UniqueName: "kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8") pod "network-check-target-ws5bp" (UID: "e4325a5a-3a6c-429b-a7f3-5a19918e6fd0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:35.599222 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:35.599143 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 13:59:35.599395 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:35.599275 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mkz26" podUID="c449dabf-b9f5-4136-b598-074040f02629" Apr 16 13:59:36.239559 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:36.239520 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret\") pod \"global-pull-secret-syncer-6dbn9\" (UID: \"087e7b97-349b-4c1c-a604-82fcaaa88534\") " pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 13:59:36.240057 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:36.239673 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:36.240057 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:36.239746 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret podName:087e7b97-349b-4c1c-a604-82fcaaa88534 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:38.23972595 +0000 UTC m=+13.247962185 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret") pod "global-pull-secret-syncer-6dbn9" (UID: "087e7b97-349b-4c1c-a604-82fcaaa88534") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:36.598957 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:36.598877 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp" Apr 16 13:59:36.599110 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:36.598887 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 13:59:36.599110 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:36.599032 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0" Apr 16 13:59:36.599218 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:36.599120 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6dbn9" podUID="087e7b97-349b-4c1c-a604-82fcaaa88534" Apr 16 13:59:37.598572 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:37.598540 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 13:59:37.598958 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:37.598654 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mkz26" podUID="c449dabf-b9f5-4136-b598-074040f02629" Apr 16 13:59:38.254676 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:38.254607 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret\") pod \"global-pull-secret-syncer-6dbn9\" (UID: \"087e7b97-349b-4c1c-a604-82fcaaa88534\") " pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 13:59:38.254868 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:38.254770 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:38.254868 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:38.254854 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret podName:087e7b97-349b-4c1c-a604-82fcaaa88534 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:42.254829503 +0000 UTC m=+17.263065741 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret") pod "global-pull-secret-syncer-6dbn9" (UID: "087e7b97-349b-4c1c-a604-82fcaaa88534") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:38.599338 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:38.599239 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 13:59:38.599770 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:38.599240 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp" Apr 16 13:59:38.599770 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:38.599367 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6dbn9" podUID="087e7b97-349b-4c1c-a604-82fcaaa88534" Apr 16 13:59:38.599770 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:38.599472 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0" Apr 16 13:59:39.599232 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:39.599190 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 13:59:39.599417 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:39.599330 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mkz26" podUID="c449dabf-b9f5-4136-b598-074040f02629" Apr 16 13:59:40.599119 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:40.599078 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 13:59:40.599295 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:40.599078 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6dbn9"
Apr 16 13:59:40.599295 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:40.599207 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0"
Apr 16 13:59:40.599395 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:40.599298 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6dbn9" podUID="087e7b97-349b-4c1c-a604-82fcaaa88534"
Apr 16 13:59:41.598819 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:41.598743 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26"
Apr 16 13:59:41.599207 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:41.598928 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mkz26" podUID="c449dabf-b9f5-4136-b598-074040f02629"
Apr 16 13:59:42.285844 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:42.285808 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret\") pod \"global-pull-secret-syncer-6dbn9\" (UID: \"087e7b97-349b-4c1c-a604-82fcaaa88534\") " pod="kube-system/global-pull-secret-syncer-6dbn9"
Apr 16 13:59:42.286085 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:42.285984 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:42.286085 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:42.286065 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret podName:087e7b97-349b-4c1c-a604-82fcaaa88534 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:50.286041952 +0000 UTC m=+25.294278186 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret") pod "global-pull-secret-syncer-6dbn9" (UID: "087e7b97-349b-4c1c-a604-82fcaaa88534") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:42.598697 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:42.598609 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6dbn9"
Apr 16 13:59:42.598874 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:42.598744 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6dbn9" podUID="087e7b97-349b-4c1c-a604-82fcaaa88534"
Apr 16 13:59:42.598874 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:42.598786 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 13:59:42.598874 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:42.598854 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0"
Apr 16 13:59:43.291358 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:43.291323 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs\") pod \"network-metrics-daemon-mkz26\" (UID: \"c449dabf-b9f5-4136-b598-074040f02629\") " pod="openshift-multus/network-metrics-daemon-mkz26"
Apr 16 13:59:43.291536 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:43.291497 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:43.291589 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:43.291570 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs podName:c449dabf-b9f5-4136-b598-074040f02629 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:59.291550019 +0000 UTC m=+34.299786260 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs") pod "network-metrics-daemon-mkz26" (UID: "c449dabf-b9f5-4136-b598-074040f02629") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:43.392622 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:43.392585 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7nh8\" (UniqueName: \"kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8\") pod \"network-check-target-ws5bp\" (UID: \"e4325a5a-3a6c-429b-a7f3-5a19918e6fd0\") " pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 13:59:43.392806 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:43.392739 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:43.392806 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:43.392755 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:43.392806 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:43.392764 2569 projected.go:194] Error preparing data for projected volume kube-api-access-n7nh8 for pod openshift-network-diagnostics/network-check-target-ws5bp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:43.392914 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:43.392822 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8 podName:e4325a5a-3a6c-429b-a7f3-5a19918e6fd0 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:59.392805033 +0000 UTC m=+34.401041267 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-n7nh8" (UniqueName: "kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8") pod "network-check-target-ws5bp" (UID: "e4325a5a-3a6c-429b-a7f3-5a19918e6fd0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:43.599365 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:43.599261 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26"
Apr 16 13:59:43.599737 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:43.599393 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mkz26" podUID="c449dabf-b9f5-4136-b598-074040f02629"
Apr 16 13:59:44.598693 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:44.598657 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 13:59:44.598867 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:44.598784 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0"
Apr 16 13:59:44.598867 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:44.598844 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6dbn9"
Apr 16 13:59:44.598999 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:44.598946 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6dbn9" podUID="087e7b97-349b-4c1c-a604-82fcaaa88534"
Apr 16 13:59:45.600756 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:45.599395 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26"
Apr 16 13:59:45.600756 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:45.599529 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mkz26" podUID="c449dabf-b9f5-4136-b598-074040f02629"
Apr 16 13:59:46.598824 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.598513 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 13:59:46.598971 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.598513 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6dbn9"
Apr 16 13:59:46.598971 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:46.598930 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0"
Apr 16 13:59:46.599085 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:46.599000 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6dbn9" podUID="087e7b97-349b-4c1c-a604-82fcaaa88534"
Apr 16 13:59:46.692685 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.692630 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal" event={"ID":"98a568f4042cac40af0eea82d9e64973","Type":"ContainerStarted","Data":"2c65c16c1d5f4bb910564b75087137b3058d8ff0840cd588dc038016b19476fc"}
Apr 16 13:59:46.694126 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.694102 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" event={"ID":"7a859253-7bd8-487d-9ee8-7b85cb9cb528","Type":"ContainerStarted","Data":"a89f139d9af916e4df9b8aab447cf0b0ea7ef15eb42c1e50f404ff9e70f91f26"}
Apr 16 13:59:46.695684 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.695656 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4wsvh" event={"ID":"b248dbae-841e-4eb7-a41e-cc738673d882","Type":"ContainerStarted","Data":"e098943eb89dee83f30a12ffe85b05568c3dfd9f7b086e31e6c8cffdf00f9f40"}
Apr 16 13:59:46.698572 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.698547 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" event={"ID":"6dcff5a1-e62a-4c95-9278-292e6b914e02","Type":"ContainerStarted","Data":"6a55c2e5f6ccecc0c7ac19ecc5546593b6e832c7dd7c03d9f30c8e5aec7f1a3f"}
Apr 16 13:59:46.698667 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.698575 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" event={"ID":"6dcff5a1-e62a-4c95-9278-292e6b914e02","Type":"ContainerStarted","Data":"bb45ac09ae4090bfc126afd501d16aa49a5c0cc7ad4731e37c9e52f063da3199"}
Apr 16 13:59:46.698667 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.698588 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" event={"ID":"6dcff5a1-e62a-4c95-9278-292e6b914e02","Type":"ContainerStarted","Data":"16ba69df3a4d379c628d588822c20eedae14d8b646d0d9fc50aab9c2e5a5d93e"}
Apr 16 13:59:46.698667 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.698601 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" event={"ID":"6dcff5a1-e62a-4c95-9278-292e6b914e02","Type":"ContainerStarted","Data":"20b6b01469c12019fd8583be5e2eb990600fe8ecdac8b13d6e18d47c06864f8e"}
Apr 16 13:59:46.698667 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.698613 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" event={"ID":"6dcff5a1-e62a-4c95-9278-292e6b914e02","Type":"ContainerStarted","Data":"ea44558a2a98b4487fbf45bde2a153bf60e8104e8fc6e81d460ffcd5d6ceeb24"}
Apr 16 13:59:46.698667 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.698624 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" event={"ID":"6dcff5a1-e62a-4c95-9278-292e6b914e02","Type":"ContainerStarted","Data":"ea4d1196a3673252e67e3be4738e3b1de0b9a2e08252878d5b01bf03dc96e595"}
Apr 16 13:59:46.699920 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.699889 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7wc6j" event={"ID":"ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f","Type":"ContainerStarted","Data":"efe19eb0a5283467d38651dd389deb96b6e0dafc0d3cd1f893b14aa437292417"}
Apr 16 13:59:46.701508 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.701477 2569 generic.go:358] "Generic (PLEG): container finished" podID="88319ece-75ee-4ddb-b42a-2a26963cba92" containerID="7fae5748a5b0c556cfd43b1587f950b5203953127e5fece1d6fbab7a6fc0889d" exitCode=0
Apr 16 13:59:46.701600 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.701567 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljdss" event={"ID":"88319ece-75ee-4ddb-b42a-2a26963cba92","Type":"ContainerDied","Data":"7fae5748a5b0c556cfd43b1587f950b5203953127e5fece1d6fbab7a6fc0889d"}
Apr 16 13:59:46.703281 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.703131 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w582s" event={"ID":"d04a5bf9-7e36-4375-aad1-26af61c2c344","Type":"ContainerStarted","Data":"9d05ba2eb8c77cc586c2691fc14abd1015086dd61fc4626d7dcea47f79c2f2f6"}
Apr 16 13:59:46.704608 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.704566 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9wmsd" event={"ID":"55dc093f-e774-41c5-a0c2-2eaa10a6e366","Type":"ContainerStarted","Data":"7b3b2d3e5078ca4c92765a1c56d42f3fcf3642b7bb00dbe2212fa3992eac40b0"}
Apr 16 13:59:46.705976 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.705952 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" event={"ID":"832755a8-0ca3-4291-86b7-728e462384ee","Type":"ContainerStarted","Data":"7f58e330b3b901c19649e5f2f0235c2db3502188dbf982bee87d440ff30570ea"}
Apr 16 13:59:46.725397 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.725358 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-w582s" podStartSLOduration=4.050089305 podStartE2EDuration="21.725343731s" podCreationTimestamp="2026-04-16 13:59:25 +0000 UTC" firstStartedPulling="2026-04-16 13:59:27.820540151 +0000 UTC m=+2.828776390" lastFinishedPulling="2026-04-16 13:59:45.49579458 +0000 UTC m=+20.504030816" observedRunningTime="2026-04-16 13:59:46.72524049 +0000 UTC m=+21.733476742" watchObservedRunningTime="2026-04-16 13:59:46.725343731 +0000 UTC m=+21.733580047"
Apr 16 13:59:46.725502 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.725428 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-60.ec2.internal" podStartSLOduration=20.725423869 podStartE2EDuration="20.725423869s" podCreationTimestamp="2026-04-16 13:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:46.70999302 +0000 UTC m=+21.718229314" watchObservedRunningTime="2026-04-16 13:59:46.725423869 +0000 UTC m=+21.733660124"
Apr 16 13:59:46.743642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.743604 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7wc6j" podStartSLOduration=3.954187868 podStartE2EDuration="21.743590912s" podCreationTimestamp="2026-04-16 13:59:25 +0000 UTC" firstStartedPulling="2026-04-16 13:59:27.837516965 +0000 UTC m=+2.845753203" lastFinishedPulling="2026-04-16 13:59:45.626920007 +0000 UTC m=+20.635156247" observedRunningTime="2026-04-16 13:59:46.743289687 +0000 UTC m=+21.751525941" watchObservedRunningTime="2026-04-16 13:59:46.743590912 +0000 UTC m=+21.751827166"
Apr 16 13:59:46.760642 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.760592 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hlkp4" podStartSLOduration=3.997569917 podStartE2EDuration="21.760577434s" podCreationTimestamp="2026-04-16 13:59:25 +0000 UTC" firstStartedPulling="2026-04-16 13:59:27.769882751 +0000 UTC m=+2.778118989" lastFinishedPulling="2026-04-16 13:59:45.532890265 +0000 UTC m=+20.541126506" observedRunningTime="2026-04-16 13:59:46.760370653 +0000 UTC m=+21.768606910" watchObservedRunningTime="2026-04-16 13:59:46.760577434 +0000 UTC m=+21.768813690"
Apr 16 13:59:46.847122 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.847078 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4wsvh" podStartSLOduration=4.145243091 podStartE2EDuration="21.847062856s" podCreationTimestamp="2026-04-16 13:59:25 +0000 UTC" firstStartedPulling="2026-04-16 13:59:27.793914417 +0000 UTC m=+2.802150650" lastFinishedPulling="2026-04-16 13:59:45.495734182 +0000 UTC m=+20.503970415" observedRunningTime="2026-04-16 13:59:46.831110552 +0000 UTC m=+21.839346806" watchObservedRunningTime="2026-04-16 13:59:46.847062856 +0000 UTC m=+21.855299110"
Apr 16 13:59:46.867522 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:46.867498 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 13:59:47.515441 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:47.515325 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:59:46.8675179Z","UUID":"8617bbd4-54e9-4268-a469-c5386b47bfa4","Handler":null,"Name":"","Endpoint":""}
Apr 16 13:59:47.517784 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:47.517760 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 13:59:47.517784 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:47.517791 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 13:59:47.598457 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:47.598411 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26"
Apr 16 13:59:47.598625 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:47.598566 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mkz26" podUID="c449dabf-b9f5-4136-b598-074040f02629"
Apr 16 13:59:47.646913 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:47.646872 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9wmsd"
Apr 16 13:59:47.647697 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:47.647679 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9wmsd"
Apr 16 13:59:47.665063 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:47.665003 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-9wmsd" podStartSLOduration=4.964281502 podStartE2EDuration="22.664985285s" podCreationTimestamp="2026-04-16 13:59:25 +0000 UTC" firstStartedPulling="2026-04-16 13:59:27.80068955 +0000 UTC m=+2.808925787" lastFinishedPulling="2026-04-16 13:59:45.501393333 +0000 UTC m=+20.509629570" observedRunningTime="2026-04-16 13:59:46.847479461 +0000 UTC m=+21.855715727" watchObservedRunningTime="2026-04-16 13:59:47.664985285 +0000 UTC m=+22.673221541"
Apr 16 13:59:47.709404 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:47.709368 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5tn2l" event={"ID":"4b2bcf95-19e8-4acb-8c03-e7b4322a90e1","Type":"ContainerStarted","Data":"b310ec68a37d79fc40714658606c2efc8534e97efe4bc5653dc4f7ca13d82c42"}
Apr 16 13:59:47.711514 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:47.711468 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" event={"ID":"7a859253-7bd8-487d-9ee8-7b85cb9cb528","Type":"ContainerStarted","Data":"ca2971122d3155946684ebef2682691c6809e8ad0b68122ffd5d885dfa803f64"}
Apr 16 13:59:47.712664 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:47.712641 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9wmsd"
Apr 16 13:59:47.713184 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:47.713138 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9wmsd"
Apr 16 13:59:47.733663 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:47.733607 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5tn2l" podStartSLOduration=4.997954812 podStartE2EDuration="22.73359265s" podCreationTimestamp="2026-04-16 13:59:25 +0000 UTC" firstStartedPulling="2026-04-16 13:59:27.807090548 +0000 UTC m=+2.815326978" lastFinishedPulling="2026-04-16 13:59:45.542728571 +0000 UTC m=+20.550964816" observedRunningTime="2026-04-16 13:59:47.73310138 +0000 UTC m=+22.741337659" watchObservedRunningTime="2026-04-16 13:59:47.73359265 +0000 UTC m=+22.741828907"
Apr 16 13:59:48.599531 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:48.599292 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6dbn9"
Apr 16 13:59:48.599707 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:48.599292 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 13:59:48.599707 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:48.599638 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6dbn9" podUID="087e7b97-349b-4c1c-a604-82fcaaa88534"
Apr 16 13:59:48.599707 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:48.599690 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0"
Apr 16 13:59:48.715554 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:48.715518 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" event={"ID":"7a859253-7bd8-487d-9ee8-7b85cb9cb528","Type":"ContainerStarted","Data":"ba10eee6cf023af48bc893a00ae63418ee218dd4179b51f39fd7cf645d8019ad"}
Apr 16 13:59:48.718978 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:48.718903 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" event={"ID":"6dcff5a1-e62a-4c95-9278-292e6b914e02","Type":"ContainerStarted","Data":"f4230d70745023bd3c62b84e5b74fa945ee0b37deab91a1c8261153c90ce4ca5"}
Apr 16 13:59:48.735414 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:48.735360 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nz5dj" podStartSLOduration=3.458209966 podStartE2EDuration="23.73534182s" podCreationTimestamp="2026-04-16 13:59:25 +0000 UTC" firstStartedPulling="2026-04-16 13:59:27.815115848 +0000 UTC m=+2.823352082" lastFinishedPulling="2026-04-16 13:59:48.0922477 +0000 UTC m=+23.100483936" observedRunningTime="2026-04-16 13:59:48.735076124 +0000 UTC m=+23.743312384" watchObservedRunningTime="2026-04-16 13:59:48.73534182 +0000 UTC m=+23.743578076"
Apr 16 13:59:49.599395 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:49.599357 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26"
Apr 16 13:59:49.599603 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:49.599485 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mkz26" podUID="c449dabf-b9f5-4136-b598-074040f02629"
Apr 16 13:59:50.347013 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:50.346968 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret\") pod \"global-pull-secret-syncer-6dbn9\" (UID: \"087e7b97-349b-4c1c-a604-82fcaaa88534\") " pod="kube-system/global-pull-secret-syncer-6dbn9"
Apr 16 13:59:50.347563 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:50.347103 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:50.347563 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:50.347169 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret podName:087e7b97-349b-4c1c-a604-82fcaaa88534 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:06.34715192 +0000 UTC m=+41.355388153 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret") pod "global-pull-secret-syncer-6dbn9" (UID: "087e7b97-349b-4c1c-a604-82fcaaa88534") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:50.598445 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:50.598408 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 13:59:50.598635 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:50.598408 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6dbn9"
Apr 16 13:59:50.598635 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:50.598545 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0"
Apr 16 13:59:50.598750 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:50.598626 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6dbn9" podUID="087e7b97-349b-4c1c-a604-82fcaaa88534"
Apr 16 13:59:51.599078 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:51.598898 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26"
Apr 16 13:59:51.599858 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:51.599180 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mkz26" podUID="c449dabf-b9f5-4136-b598-074040f02629"
Apr 16 13:59:51.726936 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:51.726900 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" event={"ID":"6dcff5a1-e62a-4c95-9278-292e6b914e02","Type":"ContainerStarted","Data":"74f42d8ea8b11088d4eb1ec433a89087cc3777fac1f1171ee8de9789eb9064f1"}
Apr 16 13:59:51.727232 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:51.727195 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:51.727232 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:51.727224 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:51.728512 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:51.728488 2569 generic.go:358] "Generic (PLEG): container finished" podID="88319ece-75ee-4ddb-b42a-2a26963cba92" containerID="023e9420c54b9b1fb1a2b714823d06c8a15b95c264c2236c32b8b3cf56455cb4" exitCode=0
Apr 16 13:59:51.728612 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:51.728533 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljdss" event={"ID":"88319ece-75ee-4ddb-b42a-2a26963cba92","Type":"ContainerDied","Data":"023e9420c54b9b1fb1a2b714823d06c8a15b95c264c2236c32b8b3cf56455cb4"}
Apr 16 13:59:51.741587 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:51.741566 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:51.758069 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:51.758031 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" podStartSLOduration=8.941984173 podStartE2EDuration="26.758019516s" podCreationTimestamp="2026-04-16 13:59:25 +0000 UTC" firstStartedPulling="2026-04-16 13:59:27.78204828 +0000 UTC m=+2.790284513" lastFinishedPulling="2026-04-16 13:59:45.598083622 +0000 UTC m=+20.606319856" observedRunningTime="2026-04-16 13:59:51.756162524 +0000 UTC m=+26.764398780" watchObservedRunningTime="2026-04-16 13:59:51.758019516 +0000 UTC m=+26.766255772"
Apr 16 13:59:52.598800 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:52.598768 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6dbn9"
Apr 16 13:59:52.598800 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:52.598797 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 13:59:52.599052 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:52.598867 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6dbn9" podUID="087e7b97-349b-4c1c-a604-82fcaaa88534"
Apr 16 13:59:52.599052 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:52.598925 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0"
Apr 16 13:59:52.730852 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:52.730816 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:52.744543 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:52.744511 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6dbn9"]
Apr 16 13:59:52.744721 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:52.744655 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6dbn9"
Apr 16 13:59:52.744803 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:52.744780 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6dbn9" podUID="087e7b97-349b-4c1c-a604-82fcaaa88534"
Apr 16 13:59:52.747459 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:52.747433 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ws5bp"]
Apr 16 13:59:52.747592 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:52.747545 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 13:59:52.747673 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:52.747649 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0"
Apr 16 13:59:52.748156 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:52.748128 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mkz26"]
Apr 16 13:59:52.748342 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:52.748323 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26"
Apr 16 13:59:52.748459 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:52.748437 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mkz26" podUID="c449dabf-b9f5-4136-b598-074040f02629"
Apr 16 13:59:52.749592 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:52.749570 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv"
Apr 16 13:59:53.733716 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:53.733494 2569 generic.go:358] "Generic (PLEG): container finished" podID="88319ece-75ee-4ddb-b42a-2a26963cba92" containerID="d0f8a253b3a9619e32ceee90e87378e51726a83680dfc04b40ad392706c6ce16" exitCode=0
Apr 16 13:59:53.733716 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:53.733575 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljdss" event={"ID":"88319ece-75ee-4ddb-b42a-2a26963cba92","Type":"ContainerDied","Data":"d0f8a253b3a9619e32ceee90e87378e51726a83680dfc04b40ad392706c6ce16"}
Apr 16 13:59:54.598438 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:54.598407 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6dbn9"
Apr 16 13:59:54.598595 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:54.598529 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6dbn9" podUID="087e7b97-349b-4c1c-a604-82fcaaa88534"
Apr 16 13:59:54.598737 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:54.598712 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26"
Apr 16 13:59:54.598849 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:54.598763 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp" Apr 16 13:59:54.598849 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:54.598839 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0" Apr 16 13:59:54.598939 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:54.598911 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mkz26" podUID="c449dabf-b9f5-4136-b598-074040f02629" Apr 16 13:59:54.737724 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:54.737645 2569 generic.go:358] "Generic (PLEG): container finished" podID="88319ece-75ee-4ddb-b42a-2a26963cba92" containerID="86ffb0d9d0c9913d38b07b430cf8633a53692a5c58db2f0192c52587ed41267b" exitCode=0 Apr 16 13:59:54.738079 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:54.737728 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljdss" event={"ID":"88319ece-75ee-4ddb-b42a-2a26963cba92","Type":"ContainerDied","Data":"86ffb0d9d0c9913d38b07b430cf8633a53692a5c58db2f0192c52587ed41267b"} Apr 16 13:59:56.599040 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:56.599006 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6dbn9"
Apr 16 13:59:56.599716 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:56.599140 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26"
Apr 16 13:59:56.599716 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:56.599144 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6dbn9" podUID="087e7b97-349b-4c1c-a604-82fcaaa88534"
Apr 16 13:59:56.599716 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:56.599290 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mkz26" podUID="c449dabf-b9f5-4136-b598-074040f02629"
Apr 16 13:59:56.599716 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:56.599357 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 13:59:56.599716 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:56.599446 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ws5bp" podUID="e4325a5a-3a6c-429b-a7f3-5a19918e6fd0"
Apr 16 13:59:58.269726 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.269687 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-60.ec2.internal" event="NodeReady"
Apr 16 13:59:58.270201 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.269874 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 13:59:58.316178 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.316137 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xstwc"]
Apr 16 13:59:58.328786 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.328755 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-q5pvc"]
Apr 16 13:59:58.328955 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.328901 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xstwc"
Apr 16 13:59:58.331844 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.331684 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 13:59:58.331844 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.331684 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 13:59:58.332822 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.332637 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wjgsd\""
Apr 16 13:59:58.340541 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.340518 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xstwc"]
Apr 16 13:59:58.340658 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.340548 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q5pvc"]
Apr 16 13:59:58.340715 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.340668 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q5pvc"
Apr 16 13:59:58.343502 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.343171 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 13:59:58.343502 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.343240 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 13:59:58.343502 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.343347 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 13:59:58.343502 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.343408 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pt57w\""
Apr 16 13:59:58.414897 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.414860 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5fb3cee7-5cda-4d24-a176-260852fbda2c-tmp-dir\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc"
Apr 16 13:59:58.415085 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.414967 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfksr\" (UniqueName: \"kubernetes.io/projected/5fb3cee7-5cda-4d24-a176-260852fbda2c-kube-api-access-rfksr\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc"
Apr 16 13:59:58.415085 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.415011 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc"
Apr 16 13:59:58.415190 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.415084 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fb3cee7-5cda-4d24-a176-260852fbda2c-config-volume\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc"
Apr 16 13:59:58.516002 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.515961 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fb3cee7-5cda-4d24-a176-260852fbda2c-config-volume\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc"
Apr 16 13:59:58.516002 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.516012 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert\") pod \"ingress-canary-q5pvc\" (UID: \"7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393\") " pod="openshift-ingress-canary/ingress-canary-q5pvc"
Apr 16 13:59:58.516270 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.516045 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frjj2\" (UniqueName: \"kubernetes.io/projected/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-kube-api-access-frjj2\") pod \"ingress-canary-q5pvc\" (UID: \"7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393\") " pod="openshift-ingress-canary/ingress-canary-q5pvc"
Apr 16 13:59:58.516270 ip-10-0-128-60
kubenswrapper[2569]: I0416 13:59:58.516109 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5fb3cee7-5cda-4d24-a176-260852fbda2c-tmp-dir\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc"
Apr 16 13:59:58.516270 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.516153 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rfksr\" (UniqueName: \"kubernetes.io/projected/5fb3cee7-5cda-4d24-a176-260852fbda2c-kube-api-access-rfksr\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc"
Apr 16 13:59:58.516270 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.516178 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc"
Apr 16 13:59:58.516472 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:58.516351 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:58.516472 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:58.516424 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls podName:5fb3cee7-5cda-4d24-a176-260852fbda2c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:59.016405129 +0000 UTC m=+34.024641376 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls") pod "dns-default-xstwc" (UID: "5fb3cee7-5cda-4d24-a176-260852fbda2c") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:58.516583 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.516513 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5fb3cee7-5cda-4d24-a176-260852fbda2c-tmp-dir\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc"
Apr 16 13:59:58.516663 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.516644 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fb3cee7-5cda-4d24-a176-260852fbda2c-config-volume\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc"
Apr 16 13:59:58.528940 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.528911 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfksr\" (UniqueName: \"kubernetes.io/projected/5fb3cee7-5cda-4d24-a176-260852fbda2c-kube-api-access-rfksr\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc"
Apr 16 13:59:58.599134 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.599104 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26"
Apr 16 13:59:58.599345 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.599104 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6dbn9"
Apr 16 13:59:58.599423 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.599104 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 13:59:58.601973 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.601941 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 13:59:58.602093 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.602005 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 13:59:58.602093 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.602028 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p5bbb\""
Apr 16 13:59:58.602176 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.602102 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 13:59:58.602176 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.602146 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 13:59:58.602287 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.602273 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fjvc5\""
Apr 16 13:59:58.616774 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.616750 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert\") pod \"ingress-canary-q5pvc\" (UID: \"7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393\") " pod="openshift-ingress-canary/ingress-canary-q5pvc"
Apr 16 13:59:58.616863 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.616800 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frjj2\" (UniqueName: \"kubernetes.io/projected/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-kube-api-access-frjj2\") pod \"ingress-canary-q5pvc\" (UID: \"7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393\") " pod="openshift-ingress-canary/ingress-canary-q5pvc"
Apr 16 13:59:58.616913 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:58.616881 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:58.616959 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:58.616938 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert podName:7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:59.116920651 +0000 UTC m=+34.125156891 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert") pod "ingress-canary-q5pvc" (UID: "7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393") : secret "canary-serving-cert" not found
Apr 16 13:59:58.625444 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:58.625426 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frjj2\" (UniqueName: \"kubernetes.io/projected/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-kube-api-access-frjj2\") pod \"ingress-canary-q5pvc\" (UID: \"7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393\") " pod="openshift-ingress-canary/ingress-canary-q5pvc"
Apr 16 13:59:59.019842 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:59.019803 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc"
Apr 16 13:59:59.020030 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:59.019990 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:59.020084 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:59.020071 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls podName:5fb3cee7-5cda-4d24-a176-260852fbda2c nodeName:}" failed. No retries permitted until 2026-04-16 14:00:00.020048545 +0000 UTC m=+35.028284794 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls") pod "dns-default-xstwc" (UID: "5fb3cee7-5cda-4d24-a176-260852fbda2c") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:59.120971 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:59.120930 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert\") pod \"ingress-canary-q5pvc\" (UID: \"7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393\") " pod="openshift-ingress-canary/ingress-canary-q5pvc"
Apr 16 13:59:59.121128 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:59.121069 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:59.121191 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:59.121145 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert podName:7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:00.121123982 +0000 UTC m=+35.129360239 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert") pod "ingress-canary-q5pvc" (UID: "7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393") : secret "canary-serving-cert" not found
Apr 16 13:59:59.322264 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:59.322160 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs\") pod \"network-metrics-daemon-mkz26\" (UID: \"c449dabf-b9f5-4136-b598-074040f02629\") " pod="openshift-multus/network-metrics-daemon-mkz26"
Apr 16 13:59:59.322816 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:59.322338 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 13:59:59.322816 ip-10-0-128-60 kubenswrapper[2569]: E0416 13:59:59.322409 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs podName:c449dabf-b9f5-4136-b598-074040f02629 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:31.32239275 +0000 UTC m=+66.330628987 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs") pod "network-metrics-daemon-mkz26" (UID: "c449dabf-b9f5-4136-b598-074040f02629") : secret "metrics-daemon-secret" not found
Apr 16 13:59:59.423654 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:59.423613 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7nh8\" (UniqueName: \"kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8\") pod \"network-check-target-ws5bp\" (UID: \"e4325a5a-3a6c-429b-a7f3-5a19918e6fd0\") " pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 13:59:59.426505 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:59.426479 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7nh8\" (UniqueName: \"kubernetes.io/projected/e4325a5a-3a6c-429b-a7f3-5a19918e6fd0-kube-api-access-n7nh8\") pod \"network-check-target-ws5bp\" (UID: \"e4325a5a-3a6c-429b-a7f3-5a19918e6fd0\") " pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 13:59:59.524320 ip-10-0-128-60 kubenswrapper[2569]: I0416 13:59:59.524285 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 14:00:00.029068 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:00.029024 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc"
Apr 16 14:00:00.029272 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:00.029202 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:00:00.029332 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:00.029303 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls podName:5fb3cee7-5cda-4d24-a176-260852fbda2c nodeName:}" failed. No retries permitted until 2026-04-16 14:00:02.029280995 +0000 UTC m=+37.037517279 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls") pod "dns-default-xstwc" (UID: "5fb3cee7-5cda-4d24-a176-260852fbda2c") : secret "dns-default-metrics-tls" not found
Apr 16 14:00:00.129488 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:00.129445 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert\") pod \"ingress-canary-q5pvc\" (UID: \"7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393\") " pod="openshift-ingress-canary/ingress-canary-q5pvc"
Apr 16 14:00:00.129656 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:00.129635 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:00:00.129724 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:00.129704 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert podName:7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:02.129688064 +0000 UTC m=+37.137924296 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert") pod "ingress-canary-q5pvc" (UID: "7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393") : secret "canary-serving-cert" not found
Apr 16 14:00:00.509760 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:00.509600 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ws5bp"]
Apr 16 14:00:00.643934 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:00:00.643849 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4325a5a_3a6c_429b_a7f3_5a19918e6fd0.slice/crio-f3fdc0e7adbcb6ace79f35859776d87adcf8a4ab40fb8718d51a6b667e898a6d WatchSource:0}: Error finding container f3fdc0e7adbcb6ace79f35859776d87adcf8a4ab40fb8718d51a6b667e898a6d: Status 404 returned error can't find the container with id f3fdc0e7adbcb6ace79f35859776d87adcf8a4ab40fb8718d51a6b667e898a6d
Apr 16 14:00:00.749161 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:00.749104 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ws5bp" event={"ID":"e4325a5a-3a6c-429b-a7f3-5a19918e6fd0","Type":"ContainerStarted","Data":"f3fdc0e7adbcb6ace79f35859776d87adcf8a4ab40fb8718d51a6b667e898a6d"}
Apr 16 14:00:01.755325 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:01.755292 2569 generic.go:358] "Generic (PLEG): container finished" podID="88319ece-75ee-4ddb-b42a-2a26963cba92" containerID="07e8b52254a5b81255740904f0ddd44e9864218aa2d22029b1d760908507567a" exitCode=0
Apr 16 14:00:01.755784 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:01.755364 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljdss" event={"ID":"88319ece-75ee-4ddb-b42a-2a26963cba92","Type":"ContainerDied","Data":"07e8b52254a5b81255740904f0ddd44e9864218aa2d22029b1d760908507567a"}
Apr 16 14:00:02.044050 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:02.043950 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc"
Apr 16 14:00:02.044225 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:02.044141 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:00:02.044323 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:02.044226 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls podName:5fb3cee7-5cda-4d24-a176-260852fbda2c nodeName:}" failed. No retries permitted until 2026-04-16 14:00:06.04420483 +0000 UTC m=+41.052441065 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls") pod "dns-default-xstwc" (UID: "5fb3cee7-5cda-4d24-a176-260852fbda2c") : secret "dns-default-metrics-tls" not found
Apr 16 14:00:02.144646 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:02.144610 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert\") pod \"ingress-canary-q5pvc\" (UID: \"7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393\") " pod="openshift-ingress-canary/ingress-canary-q5pvc"
Apr 16 14:00:02.144826 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:02.144780 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:00:02.144894 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:02.144856 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert podName:7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:06.144836033 +0000 UTC m=+41.153072267 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert") pod "ingress-canary-q5pvc" (UID: "7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393") : secret "canary-serving-cert" not found
Apr 16 14:00:02.760817 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:02.760780 2569 generic.go:358] "Generic (PLEG): container finished" podID="88319ece-75ee-4ddb-b42a-2a26963cba92" containerID="ca48cd0b83c3a9a030c4bd368f382b22e226d5e8e83725a5f9717a604491b0b5" exitCode=0
Apr 16 14:00:02.761297 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:02.760851 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljdss" event={"ID":"88319ece-75ee-4ddb-b42a-2a26963cba92","Type":"ContainerDied","Data":"ca48cd0b83c3a9a030c4bd368f382b22e226d5e8e83725a5f9717a604491b0b5"}
Apr 16 14:00:03.766981 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:03.766783 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljdss" event={"ID":"88319ece-75ee-4ddb-b42a-2a26963cba92","Type":"ContainerStarted","Data":"bc2b00780e41ff88d6b90695331efcff99f9bffb03cd509889878130fcc01e37"}
Apr 16 14:00:03.795837 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:03.795778 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ljdss" podStartSLOduration=5.935357906 podStartE2EDuration="38.795759887s" podCreationTimestamp="2026-04-16 13:59:25 +0000 UTC" firstStartedPulling="2026-04-16 13:59:27.831428484 +0000 UTC m=+2.839664732" lastFinishedPulling="2026-04-16 14:00:00.691830456 +0000 UTC m=+35.700066713" observedRunningTime="2026-04-16 14:00:03.794203612 +0000 UTC m=+38.802439861" watchObservedRunningTime="2026-04-16 14:00:03.795759887 +0000 UTC m=+38.803996149"
Apr 16 14:00:04.770097 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:04.769980 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ws5bp" event={"ID":"e4325a5a-3a6c-429b-a7f3-5a19918e6fd0","Type":"ContainerStarted","Data":"6412bb72c147733301133e9fdb79c9355ebafa4cc169ebe27195d449777ca5fd"}
Apr 16 14:00:04.770569 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:04.770354 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 14:00:04.787685 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:04.787637 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ws5bp" podStartSLOduration=36.038893386 podStartE2EDuration="39.787622474s" podCreationTimestamp="2026-04-16 13:59:25 +0000 UTC" firstStartedPulling="2026-04-16 14:00:00.66920788 +0000 UTC m=+35.677444113" lastFinishedPulling="2026-04-16 14:00:04.417936967 +0000 UTC m=+39.426173201" observedRunningTime="2026-04-16 14:00:04.785992416 +0000 UTC m=+39.794228671" watchObservedRunningTime="2026-04-16 14:00:04.787622474 +0000 UTC m=+39.795858725"
Apr 16 14:00:06.075821 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:06.075787 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc"
Apr 16 14:00:06.076286 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:06.075960 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:00:06.076286 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:06.076039 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls podName:5fb3cee7-5cda-4d24-a176-260852fbda2c nodeName:}" failed. No retries permitted until 2026-04-16 14:00:14.076015896 +0000 UTC m=+49.084252132 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls") pod "dns-default-xstwc" (UID: "5fb3cee7-5cda-4d24-a176-260852fbda2c") : secret "dns-default-metrics-tls" not found
Apr 16 14:00:06.176962 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:06.176920 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert\") pod \"ingress-canary-q5pvc\" (UID: \"7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393\") " pod="openshift-ingress-canary/ingress-canary-q5pvc"
Apr 16 14:00:06.177090 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:06.177058 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:00:06.177143 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:06.177118 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert podName:7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:14.177101215 +0000 UTC m=+49.185337448 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert") pod "ingress-canary-q5pvc" (UID: "7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393") : secret "canary-serving-cert" not found Apr 16 14:00:06.378200 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:06.378102 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret\") pod \"global-pull-secret-syncer-6dbn9\" (UID: \"087e7b97-349b-4c1c-a604-82fcaaa88534\") " pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 14:00:06.382226 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:06.382199 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/087e7b97-349b-4c1c-a604-82fcaaa88534-original-pull-secret\") pod \"global-pull-secret-syncer-6dbn9\" (UID: \"087e7b97-349b-4c1c-a604-82fcaaa88534\") " pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 14:00:06.419302 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:06.419269 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6dbn9" Apr 16 14:00:06.532163 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:06.532122 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6dbn9"] Apr 16 14:00:06.535215 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:00:06.535178 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod087e7b97_349b_4c1c_a604_82fcaaa88534.slice/crio-f9f74506191c102e4d3e55a40d3625838e9a5adb1f6796922f3c90915a6056a4 WatchSource:0}: Error finding container f9f74506191c102e4d3e55a40d3625838e9a5adb1f6796922f3c90915a6056a4: Status 404 returned error can't find the container with id f9f74506191c102e4d3e55a40d3625838e9a5adb1f6796922f3c90915a6056a4 Apr 16 14:00:06.774722 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:06.774690 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6dbn9" event={"ID":"087e7b97-349b-4c1c-a604-82fcaaa88534","Type":"ContainerStarted","Data":"f9f74506191c102e4d3e55a40d3625838e9a5adb1f6796922f3c90915a6056a4"} Apr 16 14:00:11.785717 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:11.785680 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6dbn9" event={"ID":"087e7b97-349b-4c1c-a604-82fcaaa88534","Type":"ContainerStarted","Data":"bf0b441f3d1b7e1e24dc83f8150ca339799999c14156ec7ee79ea85dbdea4202"} Apr 16 14:00:11.801295 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:11.801224 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6dbn9" podStartSLOduration=33.307311933 podStartE2EDuration="37.801209744s" podCreationTimestamp="2026-04-16 13:59:34 +0000 UTC" firstStartedPulling="2026-04-16 14:00:06.536904148 +0000 UTC m=+41.545140381" lastFinishedPulling="2026-04-16 14:00:11.030801944 +0000 UTC m=+46.039038192" 
observedRunningTime="2026-04-16 14:00:11.800700027 +0000 UTC m=+46.808936286" watchObservedRunningTime="2026-04-16 14:00:11.801209744 +0000 UTC m=+46.809445998" Apr 16 14:00:14.133510 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:14.133471 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc" Apr 16 14:00:14.133974 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:14.133650 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:14.133974 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:14.133738 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls podName:5fb3cee7-5cda-4d24-a176-260852fbda2c nodeName:}" failed. No retries permitted until 2026-04-16 14:00:30.133715552 +0000 UTC m=+65.141951789 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls") pod "dns-default-xstwc" (UID: "5fb3cee7-5cda-4d24-a176-260852fbda2c") : secret "dns-default-metrics-tls" not found Apr 16 14:00:14.234802 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:14.234766 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert\") pod \"ingress-canary-q5pvc\" (UID: \"7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393\") " pod="openshift-ingress-canary/ingress-canary-q5pvc" Apr 16 14:00:14.234965 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:14.234921 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:14.235005 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:14.234991 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert podName:7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:30.23497301 +0000 UTC m=+65.243209249 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert") pod "ingress-canary-q5pvc" (UID: "7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393") : secret "canary-serving-cert" not found Apr 16 14:00:24.748577 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:24.748542 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jdtzv" Apr 16 14:00:30.139113 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.139059 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc" Apr 16 14:00:30.139607 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:30.139204 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:30.139607 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:30.139288 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls podName:5fb3cee7-5cda-4d24-a176-260852fbda2c nodeName:}" failed. No retries permitted until 2026-04-16 14:01:02.139271063 +0000 UTC m=+97.147507311 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls") pod "dns-default-xstwc" (UID: "5fb3cee7-5cda-4d24-a176-260852fbda2c") : secret "dns-default-metrics-tls" not found Apr 16 14:00:30.239717 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.239672 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert\") pod \"ingress-canary-q5pvc\" (UID: \"7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393\") " pod="openshift-ingress-canary/ingress-canary-q5pvc" Apr 16 14:00:30.239882 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:30.239813 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:30.239882 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:30.239867 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert podName:7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:02.239853844 +0000 UTC m=+97.248090081 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert") pod "ingress-canary-q5pvc" (UID: "7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393") : secret "canary-serving-cert" not found Apr 16 14:00:30.880308 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.880273 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-7vwhw"] Apr 16 14:00:30.884707 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.884684 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn"] Apr 16 14:00:30.884867 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.884847 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" Apr 16 14:00:30.887529 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.887511 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn" Apr 16 14:00:30.888105 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.888084 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 14:00:30.888269 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.888235 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-6qnsd\"" Apr 16 14:00:30.889339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.889315 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 14:00:30.889444 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.889323 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 14:00:30.889444 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.889416 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:00:30.890939 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.890922 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 14:00:30.891048 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.890977 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 14:00:30.892121 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.891951 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:00:30.893238 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.893218 2569 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-cz2qr\"" Apr 16 14:00:30.895800 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.895773 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 14:00:30.911385 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.911145 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-7vwhw"] Apr 16 14:00:30.912093 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.912071 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn"] Apr 16 14:00:30.944484 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.944457 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6305f8-dd82-4db8-91e9-4ddbc887813b-serving-cert\") pod \"console-operator-d87b8d5fc-7vwhw\" (UID: \"9f6305f8-dd82-4db8-91e9-4ddbc887813b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" Apr 16 14:00:30.944625 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.944512 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79w6f\" (UniqueName: \"kubernetes.io/projected/9f6305f8-dd82-4db8-91e9-4ddbc887813b-kube-api-access-79w6f\") pod \"console-operator-d87b8d5fc-7vwhw\" (UID: \"9f6305f8-dd82-4db8-91e9-4ddbc887813b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" Apr 16 14:00:30.944625 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.944540 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndv2k\" (UniqueName: \"kubernetes.io/projected/fa688607-67ff-421d-baa1-1faca7c66d27-kube-api-access-ndv2k\") pod 
\"cluster-samples-operator-667775844f-tvqcn\" (UID: \"fa688607-67ff-421d-baa1-1faca7c66d27\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn" Apr 16 14:00:30.944625 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.944563 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6305f8-dd82-4db8-91e9-4ddbc887813b-config\") pod \"console-operator-d87b8d5fc-7vwhw\" (UID: \"9f6305f8-dd82-4db8-91e9-4ddbc887813b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" Apr 16 14:00:30.944625 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.944586 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f6305f8-dd82-4db8-91e9-4ddbc887813b-trusted-ca\") pod \"console-operator-d87b8d5fc-7vwhw\" (UID: \"9f6305f8-dd82-4db8-91e9-4ddbc887813b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" Apr 16 14:00:30.944787 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.944659 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-tvqcn\" (UID: \"fa688607-67ff-421d-baa1-1faca7c66d27\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn" Apr 16 14:00:30.974805 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.974782 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5b8555f84-mbcn4"] Apr 16 14:00:30.977701 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.977685 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:00:30.980505 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.980485 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 14:00:30.980608 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.980577 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 14:00:30.980608 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.980597 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 14:00:30.980721 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.980702 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hmc2p\"" Apr 16 14:00:30.980776 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.980759 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 14:00:30.980889 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.980871 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 14:00:30.981146 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.981133 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 14:00:30.989660 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:30.989635 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5b8555f84-mbcn4"] Apr 16 14:00:31.045162 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.045127 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-tvqcn\" (UID: \"fa688607-67ff-421d-baa1-1faca7c66d27\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn" Apr 16 14:00:31.045162 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.045167 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-stats-auth\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:00:31.045408 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.045196 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6305f8-dd82-4db8-91e9-4ddbc887813b-serving-cert\") pod \"console-operator-d87b8d5fc-7vwhw\" (UID: \"9f6305f8-dd82-4db8-91e9-4ddbc887813b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" Apr 16 14:00:31.045408 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:31.045282 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:00:31.045408 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.045293 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:00:31.045408 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.045319 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-default-certificate\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:00:31.045408 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:31.045348 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls podName:fa688607-67ff-421d-baa1-1faca7c66d27 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:31.545328232 +0000 UTC m=+66.553564466 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls") pod "cluster-samples-operator-667775844f-tvqcn" (UID: "fa688607-67ff-421d-baa1-1faca7c66d27") : secret "samples-operator-tls" not found Apr 16 14:00:31.045654 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.045426 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79w6f\" (UniqueName: \"kubernetes.io/projected/9f6305f8-dd82-4db8-91e9-4ddbc887813b-kube-api-access-79w6f\") pod \"console-operator-d87b8d5fc-7vwhw\" (UID: \"9f6305f8-dd82-4db8-91e9-4ddbc887813b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" Apr 16 14:00:31.045654 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.045445 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhwz6\" (UniqueName: \"kubernetes.io/projected/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-kube-api-access-dhwz6\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:00:31.045654 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.045470 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ndv2k\" (UniqueName: \"kubernetes.io/projected/fa688607-67ff-421d-baa1-1faca7c66d27-kube-api-access-ndv2k\") pod \"cluster-samples-operator-667775844f-tvqcn\" (UID: \"fa688607-67ff-421d-baa1-1faca7c66d27\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn" Apr 16 14:00:31.045654 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.045493 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6305f8-dd82-4db8-91e9-4ddbc887813b-config\") pod \"console-operator-d87b8d5fc-7vwhw\" (UID: \"9f6305f8-dd82-4db8-91e9-4ddbc887813b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" Apr 16 14:00:31.045654 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.045530 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f6305f8-dd82-4db8-91e9-4ddbc887813b-trusted-ca\") pod \"console-operator-d87b8d5fc-7vwhw\" (UID: \"9f6305f8-dd82-4db8-91e9-4ddbc887813b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" Apr 16 14:00:31.045654 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.045576 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:00:31.047536 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.047517 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6305f8-dd82-4db8-91e9-4ddbc887813b-serving-cert\") pod \"console-operator-d87b8d5fc-7vwhw\" (UID: \"9f6305f8-dd82-4db8-91e9-4ddbc887813b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" 
Apr 16 14:00:31.056713 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.056691 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndv2k\" (UniqueName: \"kubernetes.io/projected/fa688607-67ff-421d-baa1-1faca7c66d27-kube-api-access-ndv2k\") pod \"cluster-samples-operator-667775844f-tvqcn\" (UID: \"fa688607-67ff-421d-baa1-1faca7c66d27\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn" Apr 16 14:00:31.079697 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.079675 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6305f8-dd82-4db8-91e9-4ddbc887813b-config\") pod \"console-operator-d87b8d5fc-7vwhw\" (UID: \"9f6305f8-dd82-4db8-91e9-4ddbc887813b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" Apr 16 14:00:31.079951 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.079931 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f6305f8-dd82-4db8-91e9-4ddbc887813b-trusted-ca\") pod \"console-operator-d87b8d5fc-7vwhw\" (UID: \"9f6305f8-dd82-4db8-91e9-4ddbc887813b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" Apr 16 14:00:31.081509 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.081486 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79w6f\" (UniqueName: \"kubernetes.io/projected/9f6305f8-dd82-4db8-91e9-4ddbc887813b-kube-api-access-79w6f\") pod \"console-operator-d87b8d5fc-7vwhw\" (UID: \"9f6305f8-dd82-4db8-91e9-4ddbc887813b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" Apr 16 14:00:31.146581 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.146498 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-stats-auth\") pod 
\"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:00:31.146581 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.146553 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:00:31.146581 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.146579 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-default-certificate\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:00:31.147104 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.146597 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhwz6\" (UniqueName: \"kubernetes.io/projected/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-kube-api-access-dhwz6\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:00:31.147104 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.146648 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:00:31.147104 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:31.146742 2569 secret.go:189] Couldn't get secret 
openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:00:31.147104 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:31.146751 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle podName:e39140fd-a9c8-42bc-8af9-0ddd8cd8addc nodeName:}" failed. No retries permitted until 2026-04-16 14:00:31.646732434 +0000 UTC m=+66.654968682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle") pod "router-default-5b8555f84-mbcn4" (UID: "e39140fd-a9c8-42bc-8af9-0ddd8cd8addc") : configmap references non-existent config key: service-ca.crt
Apr 16 14:00:31.147104 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:31.146790 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs podName:e39140fd-a9c8-42bc-8af9-0ddd8cd8addc nodeName:}" failed. No retries permitted until 2026-04-16 14:00:31.646777284 +0000 UTC m=+66.655013539 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs") pod "router-default-5b8555f84-mbcn4" (UID: "e39140fd-a9c8-42bc-8af9-0ddd8cd8addc") : secret "router-metrics-certs-default" not found
Apr 16 14:00:31.148992 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.148971 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-stats-auth\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4"
Apr 16 14:00:31.149374 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.149353 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-default-certificate\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4"
Apr 16 14:00:31.155740 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.155719 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhwz6\" (UniqueName: \"kubernetes.io/projected/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-kube-api-access-dhwz6\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4"
Apr 16 14:00:31.195634 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.195605 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw"
Apr 16 14:00:31.310771 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.310740 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-7vwhw"]
Apr 16 14:00:31.313555 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:00:31.313532 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f6305f8_dd82_4db8_91e9_4ddbc887813b.slice/crio-b77dce8f685a432aec9b26f0d2e159b99062127134259d41777ef59a5db4eb68 WatchSource:0}: Error finding container b77dce8f685a432aec9b26f0d2e159b99062127134259d41777ef59a5db4eb68: Status 404 returned error can't find the container with id b77dce8f685a432aec9b26f0d2e159b99062127134259d41777ef59a5db4eb68
Apr 16 14:00:31.348150 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.348124 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs\") pod \"network-metrics-daemon-mkz26\" (UID: \"c449dabf-b9f5-4136-b598-074040f02629\") " pod="openshift-multus/network-metrics-daemon-mkz26"
Apr 16 14:00:31.348306 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:31.348290 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:00:31.348357 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:31.348351 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs podName:c449dabf-b9f5-4136-b598-074040f02629 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:35.348336246 +0000 UTC m=+130.356572479 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs") pod "network-metrics-daemon-mkz26" (UID: "c449dabf-b9f5-4136-b598-074040f02629") : secret "metrics-daemon-secret" not found
Apr 16 14:00:31.550116 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.550085 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-tvqcn\" (UID: \"fa688607-67ff-421d-baa1-1faca7c66d27\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn"
Apr 16 14:00:31.550305 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:31.550219 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 14:00:31.550305 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:31.550300 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls podName:fa688607-67ff-421d-baa1-1faca7c66d27 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:32.550283416 +0000 UTC m=+67.558519650 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls") pod "cluster-samples-operator-667775844f-tvqcn" (UID: "fa688607-67ff-421d-baa1-1faca7c66d27") : secret "samples-operator-tls" not found
Apr 16 14:00:31.650954 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.650908 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4"
Apr 16 14:00:31.651124 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.651008 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4"
Apr 16 14:00:31.651124 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:31.651063 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:00:31.651124 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:31.651125 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs podName:e39140fd-a9c8-42bc-8af9-0ddd8cd8addc nodeName:}" failed. No retries permitted until 2026-04-16 14:00:32.651110374 +0000 UTC m=+67.659346606 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs") pod "router-default-5b8555f84-mbcn4" (UID: "e39140fd-a9c8-42bc-8af9-0ddd8cd8addc") : secret "router-metrics-certs-default" not found
Apr 16 14:00:31.651231 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:31.651138 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle podName:e39140fd-a9c8-42bc-8af9-0ddd8cd8addc nodeName:}" failed. No retries permitted until 2026-04-16 14:00:32.651133005 +0000 UTC m=+67.659369238 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle") pod "router-default-5b8555f84-mbcn4" (UID: "e39140fd-a9c8-42bc-8af9-0ddd8cd8addc") : configmap references non-existent config key: service-ca.crt
Apr 16 14:00:31.826545 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:31.826454 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" event={"ID":"9f6305f8-dd82-4db8-91e9-4ddbc887813b","Type":"ContainerStarted","Data":"b77dce8f685a432aec9b26f0d2e159b99062127134259d41777ef59a5db4eb68"}
Apr 16 14:00:32.558119 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.558082 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-tvqcn\" (UID: \"fa688607-67ff-421d-baa1-1faca7c66d27\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn"
Apr 16 14:00:32.558556 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:32.558221 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 14:00:32.558556 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:32.558307 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls podName:fa688607-67ff-421d-baa1-1faca7c66d27 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:34.558290318 +0000 UTC m=+69.566526551 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls") pod "cluster-samples-operator-667775844f-tvqcn" (UID: "fa688607-67ff-421d-baa1-1faca7c66d27") : secret "samples-operator-tls" not found
Apr 16 14:00:32.659384 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.659345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4"
Apr 16 14:00:32.659491 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.659426 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4"
Apr 16 14:00:32.659527 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:32.659494 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:00:32.659559 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:32.659546 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle podName:e39140fd-a9c8-42bc-8af9-0ddd8cd8addc nodeName:}" failed. No retries permitted until 2026-04-16 14:00:34.659526665 +0000 UTC m=+69.667762913 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle") pod "router-default-5b8555f84-mbcn4" (UID: "e39140fd-a9c8-42bc-8af9-0ddd8cd8addc") : configmap references non-existent config key: service-ca.crt
Apr 16 14:00:32.659603 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:32.659562 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs podName:e39140fd-a9c8-42bc-8af9-0ddd8cd8addc nodeName:}" failed. No retries permitted until 2026-04-16 14:00:34.659555426 +0000 UTC m=+69.667791659 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs") pod "router-default-5b8555f84-mbcn4" (UID: "e39140fd-a9c8-42bc-8af9-0ddd8cd8addc") : secret "router-metrics-certs-default" not found
Apr 16 14:00:32.807560 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.807525 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9"]
Apr 16 14:00:32.811823 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.811778 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9"
Apr 16 14:00:32.814671 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.814641 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 14:00:32.814802 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.814642 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:00:32.814802 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.814711 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 14:00:32.815833 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.815816 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-ztkbh\""
Apr 16 14:00:32.815910 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.815881 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 14:00:32.819913 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.819886 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9"]
Apr 16 14:00:32.860482 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.860439 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21db3010-f35e-486a-9584-0dc09d164c21-config\") pod \"service-ca-operator-69965bb79d-xztn9\" (UID: \"21db3010-f35e-486a-9584-0dc09d164c21\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9"
Apr 16 14:00:32.860667 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.860498 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpckx\" (UniqueName: \"kubernetes.io/projected/21db3010-f35e-486a-9584-0dc09d164c21-kube-api-access-tpckx\") pod \"service-ca-operator-69965bb79d-xztn9\" (UID: \"21db3010-f35e-486a-9584-0dc09d164c21\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9"
Apr 16 14:00:32.860667 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.860588 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21db3010-f35e-486a-9584-0dc09d164c21-serving-cert\") pod \"service-ca-operator-69965bb79d-xztn9\" (UID: \"21db3010-f35e-486a-9584-0dc09d164c21\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9"
Apr 16 14:00:32.961730 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.961698 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21db3010-f35e-486a-9584-0dc09d164c21-serving-cert\") pod \"service-ca-operator-69965bb79d-xztn9\" (UID: \"21db3010-f35e-486a-9584-0dc09d164c21\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9"
Apr 16 14:00:32.961865 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.961761 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21db3010-f35e-486a-9584-0dc09d164c21-config\") pod \"service-ca-operator-69965bb79d-xztn9\" (UID: \"21db3010-f35e-486a-9584-0dc09d164c21\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9"
Apr 16 14:00:32.961944 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.961926 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpckx\" (UniqueName: \"kubernetes.io/projected/21db3010-f35e-486a-9584-0dc09d164c21-kube-api-access-tpckx\") pod \"service-ca-operator-69965bb79d-xztn9\" (UID: \"21db3010-f35e-486a-9584-0dc09d164c21\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9"
Apr 16 14:00:32.962235 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.962218 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21db3010-f35e-486a-9584-0dc09d164c21-config\") pod \"service-ca-operator-69965bb79d-xztn9\" (UID: \"21db3010-f35e-486a-9584-0dc09d164c21\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9"
Apr 16 14:00:32.964008 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.963988 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21db3010-f35e-486a-9584-0dc09d164c21-serving-cert\") pod \"service-ca-operator-69965bb79d-xztn9\" (UID: \"21db3010-f35e-486a-9584-0dc09d164c21\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9"
Apr 16 14:00:32.971144 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:32.971120 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpckx\" (UniqueName: \"kubernetes.io/projected/21db3010-f35e-486a-9584-0dc09d164c21-kube-api-access-tpckx\") pod \"service-ca-operator-69965bb79d-xztn9\" (UID: \"21db3010-f35e-486a-9584-0dc09d164c21\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9"
Apr 16 14:00:33.121047 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:33.120957 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9"
Apr 16 14:00:33.240289 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:33.240239 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9"]
Apr 16 14:00:33.830743 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:33.830708 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9" event={"ID":"21db3010-f35e-486a-9584-0dc09d164c21","Type":"ContainerStarted","Data":"996b3eaf478a18fe5cd4e8f1a5d7d526998d1f9aa72d3e74fe04987fccf04c6d"}
Apr 16 14:00:34.302134 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:34.302099 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-mhlgp"]
Apr 16 14:00:34.316760 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:34.316728 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-mhlgp"]
Apr 16 14:00:34.316926 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:34.316870 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mhlgp"
Apr 16 14:00:34.319907 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:34.319885 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-ljkxm\""
Apr 16 14:00:34.374495 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:34.374419 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dt9w\" (UniqueName: \"kubernetes.io/projected/9afaf590-634d-44c9-9149-a169bbbc6320-kube-api-access-5dt9w\") pod \"network-check-source-7b678d77c7-mhlgp\" (UID: \"9afaf590-634d-44c9-9149-a169bbbc6320\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mhlgp"
Apr 16 14:00:34.475122 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:34.475067 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dt9w\" (UniqueName: \"kubernetes.io/projected/9afaf590-634d-44c9-9149-a169bbbc6320-kube-api-access-5dt9w\") pod \"network-check-source-7b678d77c7-mhlgp\" (UID: \"9afaf590-634d-44c9-9149-a169bbbc6320\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mhlgp"
Apr 16 14:00:34.486635 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:34.486568 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dt9w\" (UniqueName: \"kubernetes.io/projected/9afaf590-634d-44c9-9149-a169bbbc6320-kube-api-access-5dt9w\") pod \"network-check-source-7b678d77c7-mhlgp\" (UID: \"9afaf590-634d-44c9-9149-a169bbbc6320\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mhlgp"
Apr 16 14:00:34.576615 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:34.576517 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-tvqcn\" (UID: \"fa688607-67ff-421d-baa1-1faca7c66d27\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn"
Apr 16 14:00:34.576787 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:34.576682 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 14:00:34.576787 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:34.576751 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls podName:fa688607-67ff-421d-baa1-1faca7c66d27 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:38.576731629 +0000 UTC m=+73.584967865 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls") pod "cluster-samples-operator-667775844f-tvqcn" (UID: "fa688607-67ff-421d-baa1-1faca7c66d27") : secret "samples-operator-tls" not found
Apr 16 14:00:34.629952 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:34.629917 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mhlgp"
Apr 16 14:00:34.677318 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:34.677270 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4"
Apr 16 14:00:34.677510 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:34.677386 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4"
Apr 16 14:00:34.677510 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:34.677497 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:00:34.677621 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:34.677578 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs podName:e39140fd-a9c8-42bc-8af9-0ddd8cd8addc nodeName:}" failed. No retries permitted until 2026-04-16 14:00:38.677554197 +0000 UTC m=+73.685790453 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs") pod "router-default-5b8555f84-mbcn4" (UID: "e39140fd-a9c8-42bc-8af9-0ddd8cd8addc") : secret "router-metrics-certs-default" not found
Apr 16 14:00:34.677621 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:34.677602 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle podName:e39140fd-a9c8-42bc-8af9-0ddd8cd8addc nodeName:}" failed. No retries permitted until 2026-04-16 14:00:38.677592331 +0000 UTC m=+73.685828565 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle") pod "router-default-5b8555f84-mbcn4" (UID: "e39140fd-a9c8-42bc-8af9-0ddd8cd8addc") : configmap references non-existent config key: service-ca.crt
Apr 16 14:00:34.773926 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:34.773888 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-mhlgp"]
Apr 16 14:00:35.337534 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:00:35.337498 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9afaf590_634d_44c9_9149_a169bbbc6320.slice/crio-4c520d6815a1c84216d4a4905c9c1c9f250abd5161ff0f197e850cfc2de12b56 WatchSource:0}: Error finding container 4c520d6815a1c84216d4a4905c9c1c9f250abd5161ff0f197e850cfc2de12b56: Status 404 returned error can't find the container with id 4c520d6815a1c84216d4a4905c9c1c9f250abd5161ff0f197e850cfc2de12b56
Apr 16 14:00:35.836290 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:35.836217 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mhlgp" event={"ID":"9afaf590-634d-44c9-9149-a169bbbc6320","Type":"ContainerStarted","Data":"4c520d6815a1c84216d4a4905c9c1c9f250abd5161ff0f197e850cfc2de12b56"}
Apr 16 14:00:36.777797 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:36.777764 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ws5bp"
Apr 16 14:00:36.840022 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:36.839989 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mhlgp" event={"ID":"9afaf590-634d-44c9-9149-a169bbbc6320","Type":"ContainerStarted","Data":"8e82ae5da69afc25d3aca7f191e6666e1b99c8ae45a44293b4e15f4a439bd795"}
Apr 16 14:00:36.841793 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:36.841765 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/0.log"
Apr 16 14:00:36.841933 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:36.841807 2569 generic.go:358] "Generic (PLEG): container finished" podID="9f6305f8-dd82-4db8-91e9-4ddbc887813b" containerID="cab0c278afdc37778bf232def6627b09e78d16c00f0e4d7555d1ad377a4e5683" exitCode=255
Apr 16 14:00:36.841933 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:36.841879 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" event={"ID":"9f6305f8-dd82-4db8-91e9-4ddbc887813b","Type":"ContainerDied","Data":"cab0c278afdc37778bf232def6627b09e78d16c00f0e4d7555d1ad377a4e5683"}
Apr 16 14:00:36.842137 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:36.842106 2569 scope.go:117] "RemoveContainer" containerID="cab0c278afdc37778bf232def6627b09e78d16c00f0e4d7555d1ad377a4e5683"
Apr 16 14:00:36.845275 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:36.845234 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9" event={"ID":"21db3010-f35e-486a-9584-0dc09d164c21","Type":"ContainerStarted","Data":"4da0eeccb0b84905869c94b89ae004c3849c283827805e49ef7ae73e3fb20af2"}
Apr 16 14:00:36.858100 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:36.858042 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mhlgp" podStartSLOduration=2.858019131 podStartE2EDuration="2.858019131s" podCreationTimestamp="2026-04-16 14:00:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:00:36.857030745 +0000 UTC m=+71.865267021" watchObservedRunningTime="2026-04-16 14:00:36.858019131 +0000 UTC m=+71.866255387"
Apr 16 14:00:36.879845 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:36.879795 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9" podStartSLOduration=1.986571758 podStartE2EDuration="4.879779587s" podCreationTimestamp="2026-04-16 14:00:32 +0000 UTC" firstStartedPulling="2026-04-16 14:00:33.247511238 +0000 UTC m=+68.255747477" lastFinishedPulling="2026-04-16 14:00:36.140719071 +0000 UTC m=+71.148955306" observedRunningTime="2026-04-16 14:00:36.878458235 +0000 UTC m=+71.886694493" watchObservedRunningTime="2026-04-16 14:00:36.879779587 +0000 UTC m=+71.888015842"
Apr 16 14:00:37.849582 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:37.849549 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/1.log"
Apr 16 14:00:37.850443 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:37.850418 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/0.log"
Apr 16 14:00:37.850576 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:37.850461 2569 generic.go:358] "Generic (PLEG): container finished" podID="9f6305f8-dd82-4db8-91e9-4ddbc887813b" containerID="95f75d72c89c6b61acc7a86227b77a4b7343d39f36dedb16e314cfeb81b1ffa5" exitCode=255
Apr 16 14:00:37.850576 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:37.850542 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" event={"ID":"9f6305f8-dd82-4db8-91e9-4ddbc887813b","Type":"ContainerDied","Data":"95f75d72c89c6b61acc7a86227b77a4b7343d39f36dedb16e314cfeb81b1ffa5"}
Apr 16 14:00:37.850685 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:37.850606 2569 scope.go:117] "RemoveContainer" containerID="cab0c278afdc37778bf232def6627b09e78d16c00f0e4d7555d1ad377a4e5683"
Apr 16 14:00:37.855324 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:37.855299 2569 scope.go:117] "RemoveContainer" containerID="95f75d72c89c6b61acc7a86227b77a4b7343d39f36dedb16e314cfeb81b1ffa5"
Apr 16 14:00:37.855483 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:37.855468 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-7vwhw_openshift-console-operator(9f6305f8-dd82-4db8-91e9-4ddbc887813b)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" podUID="9f6305f8-dd82-4db8-91e9-4ddbc887813b"
Apr 16 14:00:37.887592 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:37.887570 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4wsvh_b248dbae-841e-4eb7-a41e-cc738673d882/dns-node-resolver/0.log"
Apr 16 14:00:38.609388 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:38.609347 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-tvqcn\" (UID: \"fa688607-67ff-421d-baa1-1faca7c66d27\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn"
Apr 16 14:00:38.609576 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:38.609473 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 14:00:38.609576 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:38.609544 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls podName:fa688607-67ff-421d-baa1-1faca7c66d27 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:46.609526224 +0000 UTC m=+81.617762470 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls") pod "cluster-samples-operator-667775844f-tvqcn" (UID: "fa688607-67ff-421d-baa1-1faca7c66d27") : secret "samples-operator-tls" not found
Apr 16 14:00:38.710105 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:38.710059 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4"
Apr 16 14:00:38.710299 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:38.710209 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:00:38.710299 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:38.710223 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4"
Apr 16 14:00:38.710379 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:38.710302 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs podName:e39140fd-a9c8-42bc-8af9-0ddd8cd8addc nodeName:}" failed. No retries permitted until 2026-04-16 14:00:46.710285631 +0000 UTC m=+81.718521864 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs") pod "router-default-5b8555f84-mbcn4" (UID: "e39140fd-a9c8-42bc-8af9-0ddd8cd8addc") : secret "router-metrics-certs-default" not found
Apr 16 14:00:38.710379 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:38.710371 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle podName:e39140fd-a9c8-42bc-8af9-0ddd8cd8addc nodeName:}" failed. No retries permitted until 2026-04-16 14:00:46.710357827 +0000 UTC m=+81.718594068 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle") pod "router-default-5b8555f84-mbcn4" (UID: "e39140fd-a9c8-42bc-8af9-0ddd8cd8addc") : configmap references non-existent config key: service-ca.crt
Apr 16 14:00:38.853949 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:38.853922 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/1.log"
Apr 16 14:00:38.854309 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:38.854238 2569 scope.go:117] "RemoveContainer" containerID="95f75d72c89c6b61acc7a86227b77a4b7343d39f36dedb16e314cfeb81b1ffa5"
Apr 16 14:00:38.854446 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:38.854430 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-7vwhw_openshift-console-operator(9f6305f8-dd82-4db8-91e9-4ddbc887813b)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" podUID="9f6305f8-dd82-4db8-91e9-4ddbc887813b"
Apr 16 14:00:39.090054 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:39.090027 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-w582s_d04a5bf9-7e36-4375-aad1-26af61c2c344/node-ca/0.log"
Apr 16 14:00:41.195852 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:41.195808 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw"
Apr 16 14:00:41.195852 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:41.195855 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw"
Apr 16 14:00:41.196292 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:41.196193 2569 scope.go:117] "RemoveContainer" containerID="95f75d72c89c6b61acc7a86227b77a4b7343d39f36dedb16e314cfeb81b1ffa5"
Apr 16 14:00:41.196406 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:41.196385 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-7vwhw_openshift-console-operator(9f6305f8-dd82-4db8-91e9-4ddbc887813b)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" podUID="9f6305f8-dd82-4db8-91e9-4ddbc887813b"
Apr 16 14:00:46.678317 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:46.678179 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-tvqcn\" (UID: \"fa688607-67ff-421d-baa1-1faca7c66d27\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn"
Apr 16 14:00:46.680741 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:46.680713 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa688607-67ff-421d-baa1-1faca7c66d27-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-tvqcn\" (UID: \"fa688607-67ff-421d-baa1-1faca7c66d27\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn"
Apr 16 14:00:46.778787 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:46.778752 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") "
pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:00:46.778959 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:46.778814 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:00:46.778959 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:46.778925 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle podName:e39140fd-a9c8-42bc-8af9-0ddd8cd8addc nodeName:}" failed. No retries permitted until 2026-04-16 14:01:02.77890562 +0000 UTC m=+97.787141872 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle") pod "router-default-5b8555f84-mbcn4" (UID: "e39140fd-a9c8-42bc-8af9-0ddd8cd8addc") : configmap references non-existent config key: service-ca.crt Apr 16 14:00:46.778959 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:46.778927 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:00:46.779091 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:46.778964 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs podName:e39140fd-a9c8-42bc-8af9-0ddd8cd8addc nodeName:}" failed. No retries permitted until 2026-04-16 14:01:02.77895503 +0000 UTC m=+97.787191263 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs") pod "router-default-5b8555f84-mbcn4" (UID: "e39140fd-a9c8-42bc-8af9-0ddd8cd8addc") : secret "router-metrics-certs-default" not found Apr 16 14:00:46.800544 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:46.800515 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn" Apr 16 14:00:46.919427 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:46.919396 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn"] Apr 16 14:00:47.874501 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:47.874468 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn" event={"ID":"fa688607-67ff-421d-baa1-1faca7c66d27","Type":"ContainerStarted","Data":"c54f58354f5e4540f308bb742f1c27f778248f7cf6c1896d44a3cd07c387f4f6"} Apr 16 14:00:50.883137 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:50.883099 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn" event={"ID":"fa688607-67ff-421d-baa1-1faca7c66d27","Type":"ContainerStarted","Data":"4c3175a660ab07624e0d54e262e67dc210e2d7dcf8a83d4d49c802aa58fbd67d"} Apr 16 14:00:50.883137 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:50.883139 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn" event={"ID":"fa688607-67ff-421d-baa1-1faca7c66d27","Type":"ContainerStarted","Data":"461564172dcd24fc248fc639e246afb64018826a86d01644afa0a37e0666b8f8"} Apr 16 14:00:50.901566 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:50.901521 2569 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-tvqcn" podStartSLOduration=17.56770807 podStartE2EDuration="20.901506484s" podCreationTimestamp="2026-04-16 14:00:30 +0000 UTC" firstStartedPulling="2026-04-16 14:00:46.957928554 +0000 UTC m=+81.966164790" lastFinishedPulling="2026-04-16 14:00:50.291726956 +0000 UTC m=+85.299963204" observedRunningTime="2026-04-16 14:00:50.900534238 +0000 UTC m=+85.908770493" watchObservedRunningTime="2026-04-16 14:00:50.901506484 +0000 UTC m=+85.909742739" Apr 16 14:00:53.598979 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:53.598949 2569 scope.go:117] "RemoveContainer" containerID="95f75d72c89c6b61acc7a86227b77a4b7343d39f36dedb16e314cfeb81b1ffa5" Apr 16 14:00:53.893461 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:53.893381 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/2.log" Apr 16 14:00:53.893778 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:53.893764 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/1.log" Apr 16 14:00:53.893827 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:53.893796 2569 generic.go:358] "Generic (PLEG): container finished" podID="9f6305f8-dd82-4db8-91e9-4ddbc887813b" containerID="328cc128e6632aa104a8b540877e4e9d2a3737ab6544a5c328a0734800fe2d45" exitCode=255 Apr 16 14:00:53.893857 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:53.893827 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" event={"ID":"9f6305f8-dd82-4db8-91e9-4ddbc887813b","Type":"ContainerDied","Data":"328cc128e6632aa104a8b540877e4e9d2a3737ab6544a5c328a0734800fe2d45"} Apr 16 14:00:53.893857 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:53.893854 2569 scope.go:117] 
"RemoveContainer" containerID="95f75d72c89c6b61acc7a86227b77a4b7343d39f36dedb16e314cfeb81b1ffa5" Apr 16 14:00:53.894220 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:53.894203 2569 scope.go:117] "RemoveContainer" containerID="328cc128e6632aa104a8b540877e4e9d2a3737ab6544a5c328a0734800fe2d45" Apr 16 14:00:53.894428 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:00:53.894411 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-7vwhw_openshift-console-operator(9f6305f8-dd82-4db8-91e9-4ddbc887813b)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" podUID="9f6305f8-dd82-4db8-91e9-4ddbc887813b" Apr 16 14:00:54.897881 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:54.897851 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/2.log" Apr 16 14:00:59.634030 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.633993 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-n22cn"] Apr 16 14:00:59.638157 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.638137 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.642969 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.642935 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:00:59.643126 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.643035 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:00:59.643126 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.643049 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-sdpvr\"" Apr 16 14:00:59.643126 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.643065 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:00:59.643126 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.643088 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:00:59.651356 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.651333 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-n22cn"] Apr 16 14:00:59.724795 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.724761 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7df7868c7f-q22rr"] Apr 16 14:00:59.727781 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.727766 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.730948 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.730923 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-khk5g\"" Apr 16 14:00:59.731086 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.730964 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 14:00:59.731235 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.731222 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 14:00:59.731309 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.731295 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 14:00:59.737285 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.737243 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 14:00:59.742775 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.742752 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7df7868c7f-q22rr"] Apr 16 14:00:59.785047 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.785018 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8b7545ee-aad8-4cc3-876d-a3a9c72d72c9-data-volume\") pod \"insights-runtime-extractor-n22cn\" (UID: \"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9\") " pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.785047 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.785049 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8b7545ee-aad8-4cc3-876d-a3a9c72d72c9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n22cn\" (UID: \"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9\") " pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.785234 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.785071 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8b7545ee-aad8-4cc3-876d-a3a9c72d72c9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n22cn\" (UID: \"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9\") " pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.785234 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.785096 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8b7545ee-aad8-4cc3-876d-a3a9c72d72c9-crio-socket\") pod \"insights-runtime-extractor-n22cn\" (UID: \"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9\") " pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.785234 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.785200 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kslsg\" (UniqueName: \"kubernetes.io/projected/8b7545ee-aad8-4cc3-876d-a3a9c72d72c9-kube-api-access-kslsg\") pod \"insights-runtime-extractor-n22cn\" (UID: \"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9\") " pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.886279 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.886161 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kslsg\" (UniqueName: \"kubernetes.io/projected/8b7545ee-aad8-4cc3-876d-a3a9c72d72c9-kube-api-access-kslsg\") pod \"insights-runtime-extractor-n22cn\" (UID: 
\"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9\") " pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.886279 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.886211 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ba3df13-9e23-42a3-86a2-4929bfedb89a-bound-sa-token\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.886520 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.886393 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6ba3df13-9e23-42a3-86a2-4929bfedb89a-registry-certificates\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.886520 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.886428 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6ba3df13-9e23-42a3-86a2-4929bfedb89a-installation-pull-secrets\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.886520 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.886478 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6ba3df13-9e23-42a3-86a2-4929bfedb89a-image-registry-private-configuration\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.886520 
ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.886507 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-696lh\" (UniqueName: \"kubernetes.io/projected/6ba3df13-9e23-42a3-86a2-4929bfedb89a-kube-api-access-696lh\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.886714 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.886536 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ba3df13-9e23-42a3-86a2-4929bfedb89a-trusted-ca\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.886714 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.886566 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6ba3df13-9e23-42a3-86a2-4929bfedb89a-ca-trust-extracted\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.886714 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.886598 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8b7545ee-aad8-4cc3-876d-a3a9c72d72c9-data-volume\") pod \"insights-runtime-extractor-n22cn\" (UID: \"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9\") " pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.886714 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.886652 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/6ba3df13-9e23-42a3-86a2-4929bfedb89a-registry-tls\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.886899 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.886741 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8b7545ee-aad8-4cc3-876d-a3a9c72d72c9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n22cn\" (UID: \"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9\") " pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.886899 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.886777 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8b7545ee-aad8-4cc3-876d-a3a9c72d72c9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n22cn\" (UID: \"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9\") " pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.886899 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.886809 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8b7545ee-aad8-4cc3-876d-a3a9c72d72c9-crio-socket\") pod \"insights-runtime-extractor-n22cn\" (UID: \"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9\") " pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.886899 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.886866 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8b7545ee-aad8-4cc3-876d-a3a9c72d72c9-data-volume\") pod \"insights-runtime-extractor-n22cn\" (UID: \"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9\") " pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.887045 ip-10-0-128-60 
kubenswrapper[2569]: I0416 14:00:59.886904 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8b7545ee-aad8-4cc3-876d-a3a9c72d72c9-crio-socket\") pod \"insights-runtime-extractor-n22cn\" (UID: \"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9\") " pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.887199 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.887181 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8b7545ee-aad8-4cc3-876d-a3a9c72d72c9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n22cn\" (UID: \"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9\") " pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.888989 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.888969 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8b7545ee-aad8-4cc3-876d-a3a9c72d72c9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n22cn\" (UID: \"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9\") " pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.898383 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.898357 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kslsg\" (UniqueName: \"kubernetes.io/projected/8b7545ee-aad8-4cc3-876d-a3a9c72d72c9-kube-api-access-kslsg\") pod \"insights-runtime-extractor-n22cn\" (UID: \"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9\") " pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.947690 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.947656 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-n22cn" Apr 16 14:00:59.988028 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.987990 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ba3df13-9e23-42a3-86a2-4929bfedb89a-bound-sa-token\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.988212 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.988040 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6ba3df13-9e23-42a3-86a2-4929bfedb89a-registry-certificates\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.988212 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.988069 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6ba3df13-9e23-42a3-86a2-4929bfedb89a-installation-pull-secrets\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.988212 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.988095 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6ba3df13-9e23-42a3-86a2-4929bfedb89a-image-registry-private-configuration\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.988212 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.988114 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-696lh\" (UniqueName: \"kubernetes.io/projected/6ba3df13-9e23-42a3-86a2-4929bfedb89a-kube-api-access-696lh\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.988212 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.988142 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ba3df13-9e23-42a3-86a2-4929bfedb89a-trusted-ca\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.988212 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.988173 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6ba3df13-9e23-42a3-86a2-4929bfedb89a-ca-trust-extracted\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.988497 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.988219 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6ba3df13-9e23-42a3-86a2-4929bfedb89a-registry-tls\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.988811 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.988781 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6ba3df13-9e23-42a3-86a2-4929bfedb89a-ca-trust-extracted\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " 
pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.989204 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.989178 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6ba3df13-9e23-42a3-86a2-4929bfedb89a-registry-certificates\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.989516 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.989490 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ba3df13-9e23-42a3-86a2-4929bfedb89a-trusted-ca\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.991232 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.991134 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6ba3df13-9e23-42a3-86a2-4929bfedb89a-image-registry-private-configuration\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.991232 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.991139 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6ba3df13-9e23-42a3-86a2-4929bfedb89a-installation-pull-secrets\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.992035 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.991678 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/6ba3df13-9e23-42a3-86a2-4929bfedb89a-registry-tls\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.997167 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.997138 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ba3df13-9e23-42a3-86a2-4929bfedb89a-bound-sa-token\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:00:59.998238 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:00:59.998189 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-696lh\" (UniqueName: \"kubernetes.io/projected/6ba3df13-9e23-42a3-86a2-4929bfedb89a-kube-api-access-696lh\") pod \"image-registry-7df7868c7f-q22rr\" (UID: \"6ba3df13-9e23-42a3-86a2-4929bfedb89a\") " pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:01:00.037963 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:00.037034 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:01:00.066909 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:00.066870 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-n22cn"] Apr 16 14:01:00.071086 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:01:00.071056 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b7545ee_aad8_4cc3_876d_a3a9c72d72c9.slice/crio-955ab9011f11224f5887edee592817263bb09e52e19cfa5331c4822849c8dfe9 WatchSource:0}: Error finding container 955ab9011f11224f5887edee592817263bb09e52e19cfa5331c4822849c8dfe9: Status 404 returned error can't find the container with id 955ab9011f11224f5887edee592817263bb09e52e19cfa5331c4822849c8dfe9 Apr 16 14:01:00.164879 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:00.164852 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7df7868c7f-q22rr"] Apr 16 14:01:00.168529 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:01:00.168497 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ba3df13_9e23_42a3_86a2_4929bfedb89a.slice/crio-50a1198d3a6b1a4e9e00eb8753be1d8e91832eb1c94f97807441af568b3c8dc4 WatchSource:0}: Error finding container 50a1198d3a6b1a4e9e00eb8753be1d8e91832eb1c94f97807441af568b3c8dc4: Status 404 returned error can't find the container with id 50a1198d3a6b1a4e9e00eb8753be1d8e91832eb1c94f97807441af568b3c8dc4 Apr 16 14:01:00.918007 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:00.917970 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" event={"ID":"6ba3df13-9e23-42a3-86a2-4929bfedb89a","Type":"ContainerStarted","Data":"246c4c3fe9f8c2dcd845af9ada62a606453fab4857f836ff19e6f683a3d49350"} Apr 16 14:01:00.918480 ip-10-0-128-60 
kubenswrapper[2569]: I0416 14:01:00.918016 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" event={"ID":"6ba3df13-9e23-42a3-86a2-4929bfedb89a","Type":"ContainerStarted","Data":"50a1198d3a6b1a4e9e00eb8753be1d8e91832eb1c94f97807441af568b3c8dc4"} Apr 16 14:01:00.918480 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:00.918126 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" Apr 16 14:01:00.919634 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:00.919602 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n22cn" event={"ID":"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9","Type":"ContainerStarted","Data":"0ae69b124316873c4d1bf0cd425ab04f3c28496e3b784d9e8b35d58f68e7abad"} Apr 16 14:01:00.919719 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:00.919635 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n22cn" event={"ID":"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9","Type":"ContainerStarted","Data":"955ab9011f11224f5887edee592817263bb09e52e19cfa5331c4822849c8dfe9"} Apr 16 14:01:00.948143 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:00.948090 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" podStartSLOduration=1.948074336 podStartE2EDuration="1.948074336s" podCreationTimestamp="2026-04-16 14:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:00.947993737 +0000 UTC m=+95.956229992" watchObservedRunningTime="2026-04-16 14:01:00.948074336 +0000 UTC m=+95.956310582" Apr 16 14:01:01.195954 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:01.195856 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" Apr 16 14:01:01.195954 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:01.195905 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" Apr 16 14:01:01.196340 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:01.196324 2569 scope.go:117] "RemoveContainer" containerID="328cc128e6632aa104a8b540877e4e9d2a3737ab6544a5c328a0734800fe2d45" Apr 16 14:01:01.196528 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:01:01.196510 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-7vwhw_openshift-console-operator(9f6305f8-dd82-4db8-91e9-4ddbc887813b)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" podUID="9f6305f8-dd82-4db8-91e9-4ddbc887813b" Apr 16 14:01:01.924581 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:01.924544 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n22cn" event={"ID":"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9","Type":"ContainerStarted","Data":"6b12de087794e0c71afc7c30856085a055dbc4752bdbcf07a3bfa776db0481f0"} Apr 16 14:01:02.205729 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.205638 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls\") pod \"dns-default-xstwc\" (UID: \"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc" Apr 16 14:01:02.208340 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.208314 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fb3cee7-5cda-4d24-a176-260852fbda2c-metrics-tls\") pod \"dns-default-xstwc\" (UID: 
\"5fb3cee7-5cda-4d24-a176-260852fbda2c\") " pod="openshift-dns/dns-default-xstwc" Apr 16 14:01:02.244202 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.244169 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wjgsd\"" Apr 16 14:01:02.251740 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.251705 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xstwc" Apr 16 14:01:02.307164 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.307000 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert\") pod \"ingress-canary-q5pvc\" (UID: \"7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393\") " pod="openshift-ingress-canary/ingress-canary-q5pvc" Apr 16 14:01:02.310220 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.310193 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393-cert\") pod \"ingress-canary-q5pvc\" (UID: \"7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393\") " pod="openshift-ingress-canary/ingress-canary-q5pvc" Apr 16 14:01:02.385167 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.385136 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xstwc"] Apr 16 14:01:02.554449 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.554358 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pt57w\"" Apr 16 14:01:02.561882 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.561854 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q5pvc" Apr 16 14:01:02.705580 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:01:02.705541 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fb3cee7_5cda_4d24_a176_260852fbda2c.slice/crio-43ed1004683f6ec06a6a5a04f14376b3300e3e9916dc3a67eb2952305ae859bc WatchSource:0}: Error finding container 43ed1004683f6ec06a6a5a04f14376b3300e3e9916dc3a67eb2952305ae859bc: Status 404 returned error can't find the container with id 43ed1004683f6ec06a6a5a04f14376b3300e3e9916dc3a67eb2952305ae859bc Apr 16 14:01:02.812911 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.812876 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:01:02.813039 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.812998 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:01:02.813727 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.813699 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-service-ca-bundle\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:01:02.815177 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.815150 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e39140fd-a9c8-42bc-8af9-0ddd8cd8addc-metrics-certs\") pod \"router-default-5b8555f84-mbcn4\" (UID: \"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc\") " pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:01:02.825896 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.825871 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q5pvc"] Apr 16 14:01:02.828690 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:01:02.828666 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e8b38d3_0b12_4b9f_9c2c_d79c4a4aa393.slice/crio-dc6deaadb28791ba02b548088cf9080e47fdaed3f94d1e54a42b96691f509dc7 WatchSource:0}: Error finding container dc6deaadb28791ba02b548088cf9080e47fdaed3f94d1e54a42b96691f509dc7: Status 404 returned error can't find the container with id dc6deaadb28791ba02b548088cf9080e47fdaed3f94d1e54a42b96691f509dc7 Apr 16 14:01:02.929316 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.929281 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n22cn" event={"ID":"8b7545ee-aad8-4cc3-876d-a3a9c72d72c9","Type":"ContainerStarted","Data":"dc40d29a97ad3001faedf747de617f5f81022beaf887f51c05e6ddc8b6fa10e8"} Apr 16 14:01:02.930366 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.930337 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q5pvc" event={"ID":"7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393","Type":"ContainerStarted","Data":"dc6deaadb28791ba02b548088cf9080e47fdaed3f94d1e54a42b96691f509dc7"} Apr 16 14:01:02.931331 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.931310 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xstwc" 
event={"ID":"5fb3cee7-5cda-4d24-a176-260852fbda2c","Type":"ContainerStarted","Data":"43ed1004683f6ec06a6a5a04f14376b3300e3e9916dc3a67eb2952305ae859bc"} Apr 16 14:01:02.951069 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:02.951017 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-n22cn" podStartSLOduration=1.343539169 podStartE2EDuration="3.95099883s" podCreationTimestamp="2026-04-16 14:00:59 +0000 UTC" firstStartedPulling="2026-04-16 14:01:00.147408685 +0000 UTC m=+95.155644918" lastFinishedPulling="2026-04-16 14:01:02.754868346 +0000 UTC m=+97.763104579" observedRunningTime="2026-04-16 14:01:02.950387441 +0000 UTC m=+97.958623697" watchObservedRunningTime="2026-04-16 14:01:02.95099883 +0000 UTC m=+97.959235082" Apr 16 14:01:03.086063 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:03.085972 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:01:03.225761 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:03.225727 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5b8555f84-mbcn4"] Apr 16 14:01:03.228757 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:01:03.228730 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode39140fd_a9c8_42bc_8af9_0ddd8cd8addc.slice/crio-2b2112cf02903266bf94df4d61c5733d90824fe30f07b8d5fa87c2cf6a921b42 WatchSource:0}: Error finding container 2b2112cf02903266bf94df4d61c5733d90824fe30f07b8d5fa87c2cf6a921b42: Status 404 returned error can't find the container with id 2b2112cf02903266bf94df4d61c5733d90824fe30f07b8d5fa87c2cf6a921b42 Apr 16 14:01:03.939114 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:03.939064 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5b8555f84-mbcn4" 
event={"ID":"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc","Type":"ContainerStarted","Data":"8652a5802214d5330ed7249b8a7f70cb630eda7e705e8520d899687bab398d6c"} Apr 16 14:01:03.939494 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:03.939121 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5b8555f84-mbcn4" event={"ID":"e39140fd-a9c8-42bc-8af9-0ddd8cd8addc","Type":"ContainerStarted","Data":"2b2112cf02903266bf94df4d61c5733d90824fe30f07b8d5fa87c2cf6a921b42"} Apr 16 14:01:03.960908 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:03.960859 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5b8555f84-mbcn4" podStartSLOduration=33.960844645 podStartE2EDuration="33.960844645s" podCreationTimestamp="2026-04-16 14:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:03.959616494 +0000 UTC m=+98.967852832" watchObservedRunningTime="2026-04-16 14:01:03.960844645 +0000 UTC m=+98.969080903" Apr 16 14:01:04.087209 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:04.087166 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:01:04.090199 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:04.090175 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:01:04.942460 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:04.942426 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:01:04.944049 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:04.944013 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5b8555f84-mbcn4" Apr 16 14:01:05.551392 ip-10-0-128-60 kubenswrapper[2569]: 
I0416 14:01:05.551361 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p5mcv"] Apr 16 14:01:05.554730 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:05.554706 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p5mcv" Apr 16 14:01:05.558146 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:05.558124 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-f6ch6\"" Apr 16 14:01:05.558300 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:05.558152 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 14:01:05.565567 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:05.565532 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p5mcv"] Apr 16 14:01:05.739843 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:05.739803 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c80c5960-118a-4109-8282-3f4b1769aa2f-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-p5mcv\" (UID: \"c80c5960-118a-4109-8282-3f4b1769aa2f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p5mcv" Apr 16 14:01:05.841360 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:05.841273 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c80c5960-118a-4109-8282-3f4b1769aa2f-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-p5mcv\" (UID: \"c80c5960-118a-4109-8282-3f4b1769aa2f\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p5mcv" Apr 16 14:01:05.844033 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:05.844006 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c80c5960-118a-4109-8282-3f4b1769aa2f-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-p5mcv\" (UID: \"c80c5960-118a-4109-8282-3f4b1769aa2f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p5mcv" Apr 16 14:01:05.863996 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:05.863966 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p5mcv" Apr 16 14:01:05.946023 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:05.945990 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q5pvc" event={"ID":"7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393","Type":"ContainerStarted","Data":"2e302ed685d9f28866e621ca741032b959bb8ba3eed7dc89690a76a620418071"} Apr 16 14:01:05.969767 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:05.969715 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-q5pvc" podStartSLOduration=65.753221496 podStartE2EDuration="1m7.969696841s" podCreationTimestamp="2026-04-16 13:59:58 +0000 UTC" firstStartedPulling="2026-04-16 14:01:02.830442263 +0000 UTC m=+97.838678495" lastFinishedPulling="2026-04-16 14:01:05.046917608 +0000 UTC m=+100.055153840" observedRunningTime="2026-04-16 14:01:05.967589655 +0000 UTC m=+100.975825910" watchObservedRunningTime="2026-04-16 14:01:05.969696841 +0000 UTC m=+100.977933108" Apr 16 14:01:06.044227 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:06.044167 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p5mcv"] Apr 16 
14:01:06.047547 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:01:06.047511 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc80c5960_118a_4109_8282_3f4b1769aa2f.slice/crio-fe8a11cf72f3f1f7ca6f0be952333f20bde07ee2451f1ea7bcb52f9f91f0520c WatchSource:0}: Error finding container fe8a11cf72f3f1f7ca6f0be952333f20bde07ee2451f1ea7bcb52f9f91f0520c: Status 404 returned error can't find the container with id fe8a11cf72f3f1f7ca6f0be952333f20bde07ee2451f1ea7bcb52f9f91f0520c Apr 16 14:01:06.951021 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:06.950977 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p5mcv" event={"ID":"c80c5960-118a-4109-8282-3f4b1769aa2f","Type":"ContainerStarted","Data":"fe8a11cf72f3f1f7ca6f0be952333f20bde07ee2451f1ea7bcb52f9f91f0520c"} Apr 16 14:01:06.952889 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:06.952832 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xstwc" event={"ID":"5fb3cee7-5cda-4d24-a176-260852fbda2c","Type":"ContainerStarted","Data":"157e734fa0fb1e8bc6148e4d92556acd9e5656e09c7b36ebe0da1af6bafa6c65"} Apr 16 14:01:06.952889 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:06.952869 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xstwc" event={"ID":"5fb3cee7-5cda-4d24-a176-260852fbda2c","Type":"ContainerStarted","Data":"bb30fa1840dfe8f030af3dd86039400afdf81106bfc7ae76f9d2bdf38068c667"} Apr 16 14:01:06.972896 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:06.972842 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xstwc" podStartSLOduration=65.711899262 podStartE2EDuration="1m8.97282786s" podCreationTimestamp="2026-04-16 13:59:58 +0000 UTC" firstStartedPulling="2026-04-16 14:01:02.707814894 +0000 UTC m=+97.716051128" lastFinishedPulling="2026-04-16 
14:01:05.96874348 +0000 UTC m=+100.976979726" observedRunningTime="2026-04-16 14:01:06.972326095 +0000 UTC m=+101.980562353" watchObservedRunningTime="2026-04-16 14:01:06.97282786 +0000 UTC m=+101.981064120" Apr 16 14:01:07.957579 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:07.957543 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p5mcv" event={"ID":"c80c5960-118a-4109-8282-3f4b1769aa2f","Type":"ContainerStarted","Data":"af8ab442bfb9543e22a59a9d82d86270e2b720d1d1423a75f5f47207f72bdff8"} Apr 16 14:01:07.958002 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:07.957765 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p5mcv" Apr 16 14:01:07.958002 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:07.957786 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xstwc" Apr 16 14:01:07.962845 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:07.962824 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p5mcv" Apr 16 14:01:07.977761 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:07.977709 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-p5mcv" podStartSLOduration=1.7478577469999999 podStartE2EDuration="2.977696758s" podCreationTimestamp="2026-04-16 14:01:05 +0000 UTC" firstStartedPulling="2026-04-16 14:01:06.049827024 +0000 UTC m=+101.058063258" lastFinishedPulling="2026-04-16 14:01:07.279666037 +0000 UTC m=+102.287902269" observedRunningTime="2026-04-16 14:01:07.976808359 +0000 UTC m=+102.985044614" watchObservedRunningTime="2026-04-16 14:01:07.977696758 +0000 UTC m=+102.985933012" Apr 16 14:01:14.039314 ip-10-0-128-60 kubenswrapper[2569]: I0416 
14:01:14.039283 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-w7qjp"] Apr 16 14:01:14.041829 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.041810 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.044853 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.044834 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-w66d6\"" Apr 16 14:01:14.045044 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.045023 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:01:14.045429 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.045406 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:01:14.045914 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.045892 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:01:14.046016 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.045975 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:01:14.046016 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.046004 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:01:14.046125 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.046032 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:01:14.209776 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.209736 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-wtmp\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.209952 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.209791 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-metrics-client-ca\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.209952 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.209869 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-tls\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.209952 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.209897 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-root\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.210050 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.209991 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-sys\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.210050 ip-10-0-128-60 
kubenswrapper[2569]: I0416 14:01:14.210018 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-textfile\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.210050 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.210046 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-accelerators-collector-config\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.210142 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.210068 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.210142 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.210087 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj4zf\" (UniqueName: \"kubernetes.io/projected/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-kube-api-access-sj4zf\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.311055 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.310977 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-accelerators-collector-config\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.311055 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.311018 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.311055 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.311040 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sj4zf\" (UniqueName: \"kubernetes.io/projected/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-kube-api-access-sj4zf\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.311242 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.311223 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-wtmp\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.311326 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.311310 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-metrics-client-ca\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.311371 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.311345 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-tls\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.311405 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.311366 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-wtmp\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.311405 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.311378 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-root\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.311476 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.311416 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-root\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.311476 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.311434 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-sys\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.311476 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.311461 2569 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-textfile\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.311562 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.311486 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-sys\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.311562 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:01:14.311507 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 14:01:14.311619 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:01:14.311569 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-tls podName:ea67638c-bae3-407e-bb50-aeeae5ea8f7d nodeName:}" failed. No retries permitted until 2026-04-16 14:01:14.811548505 +0000 UTC m=+109.819784740 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-tls") pod "node-exporter-w7qjp" (UID: "ea67638c-bae3-407e-bb50-aeeae5ea8f7d") : secret "node-exporter-tls" not found Apr 16 14:01:14.312027 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.312003 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-textfile\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.312129 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.312090 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-accelerators-collector-config\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.312216 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.312143 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-metrics-client-ca\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.313899 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.313880 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.321494 ip-10-0-128-60 
kubenswrapper[2569]: I0416 14:01:14.321474 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj4zf\" (UniqueName: \"kubernetes.io/projected/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-kube-api-access-sj4zf\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.598797 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.598710 2569 scope.go:117] "RemoveContainer" containerID="328cc128e6632aa104a8b540877e4e9d2a3737ab6544a5c328a0734800fe2d45" Apr 16 14:01:14.815025 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.814990 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-tls\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:14.815229 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:01:14.815165 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 14:01:14.815331 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:01:14.815246 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-tls podName:ea67638c-bae3-407e-bb50-aeeae5ea8f7d nodeName:}" failed. No retries permitted until 2026-04-16 14:01:15.81522456 +0000 UTC m=+110.823460793 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-tls") pod "node-exporter-w7qjp" (UID: "ea67638c-bae3-407e-bb50-aeeae5ea8f7d") : secret "node-exporter-tls" not found Apr 16 14:01:14.981168 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.981140 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/2.log" Apr 16 14:01:14.981354 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.981235 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" event={"ID":"9f6305f8-dd82-4db8-91e9-4ddbc887813b","Type":"ContainerStarted","Data":"ccec7fec618e8ac0a93afbc222da72a4af281199d15092c817424954d87678e9"} Apr 16 14:01:14.981553 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.981521 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" Apr 16 14:01:14.999734 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:14.999686 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" podStartSLOduration=40.169322362 podStartE2EDuration="44.999672619s" podCreationTimestamp="2026-04-16 14:00:30 +0000 UTC" firstStartedPulling="2026-04-16 14:00:31.315381971 +0000 UTC m=+66.323618204" lastFinishedPulling="2026-04-16 14:00:36.145732224 +0000 UTC m=+71.153968461" observedRunningTime="2026-04-16 14:01:14.997636687 +0000 UTC m=+110.005872942" watchObservedRunningTime="2026-04-16 14:01:14.999672619 +0000 UTC m=+110.007908874" Apr 16 14:01:15.160000 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.159963 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:01:15.162805 ip-10-0-128-60 
kubenswrapper[2569]: I0416 14:01:15.162788 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.165461 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.165438 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 14:01:15.165871 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.165852 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 14:01:15.166117 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.166101 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 14:01:15.166117 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.166106 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 14:01:15.166585 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.166505 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 14:01:15.166585 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.166513 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 14:01:15.166585 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.166542 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-kp9lr\"" Apr 16 14:01:15.167522 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.167501 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 14:01:15.167635 ip-10-0-128-60 kubenswrapper[2569]: I0416 
14:01:15.167530 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 14:01:15.167635 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.167547 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 14:01:15.180887 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.180852 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:01:15.217860 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.217828 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.218015 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.217871 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.218015 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.217900 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.218015 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.217941 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.218015 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.217984 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf9hd\" (UniqueName: \"kubernetes.io/projected/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-kube-api-access-rf9hd\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.218143 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.218029 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.218143 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.218065 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.218143 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.218092 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.218143 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.218122 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-config-out\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.218328 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.218161 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.218328 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.218241 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-config-volume\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.218328 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.218290 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.218417 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.218326 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-web-config\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.318766 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.318669 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-config-out\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.318766 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.318712 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.318985 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.318768 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-config-volume\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.318985 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.318794 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.318985 
ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.318830 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-web-config\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.318985 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.318868 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.318985 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.318894 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.318985 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.318918 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.318985 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.318946 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") 
" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.319392 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.318992 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rf9hd\" (UniqueName: \"kubernetes.io/projected/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-kube-api-access-rf9hd\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.319392 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.319022 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.319392 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.319046 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.319392 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.319074 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.319609 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:01:15.319558 2569 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 14:01:15.319663 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:01:15.319629 2569 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-main-tls podName:6c8e5ea7-027c-46cd-a8e8-f50fb343bc94 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:15.81960847 +0000 UTC m=+110.827844717 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94") : secret "alertmanager-main-tls" not found Apr 16 14:01:15.319831 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.319805 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.320155 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.320123 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.320694 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.320662 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.321842 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.321813 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.322011 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.321986 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-config-out\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.322829 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.322568 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.322829 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.322597 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-config-volume\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.322829 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.322743 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-web-config\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.323058 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.322887 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.323700 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.323681 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.323949 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.323929 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.330564 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.330539 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf9hd\" (UniqueName: \"kubernetes.io/projected/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-kube-api-access-rf9hd\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.633435 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.633353 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-7vwhw" Apr 16 14:01:15.823756 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.823720 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.823756 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.823760 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-tls\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:15.826415 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.826383 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:15.826514 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.826447 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ea67638c-bae3-407e-bb50-aeeae5ea8f7d-node-exporter-tls\") pod \"node-exporter-w7qjp\" (UID: \"ea67638c-bae3-407e-bb50-aeeae5ea8f7d\") " pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:15.851404 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.851375 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-w7qjp" Apr 16 14:01:15.859463 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:01:15.859431 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea67638c_bae3_407e_bb50_aeeae5ea8f7d.slice/crio-b1c29cc695d346fd7b1b11411e8120c0a2abcfbc7ba010c0198e8d0a2a3bd20a WatchSource:0}: Error finding container b1c29cc695d346fd7b1b11411e8120c0a2abcfbc7ba010c0198e8d0a2a3bd20a: Status 404 returned error can't find the container with id b1c29cc695d346fd7b1b11411e8120c0a2abcfbc7ba010c0198e8d0a2a3bd20a Apr 16 14:01:15.985051 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:15.985017 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w7qjp" event={"ID":"ea67638c-bae3-407e-bb50-aeeae5ea8f7d","Type":"ContainerStarted","Data":"b1c29cc695d346fd7b1b11411e8120c0a2abcfbc7ba010c0198e8d0a2a3bd20a"} Apr 16 14:01:16.072935 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:16.072899 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:16.203404 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:16.203378 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:01:16.205395 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:01:16.205371 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c8e5ea7_027c_46cd_a8e8_f50fb343bc94.slice/crio-61542d519385aadf0f933d58b664e7ea6fbc440e55cf9841ecdcf21612dbcf95 WatchSource:0}: Error finding container 61542d519385aadf0f933d58b664e7ea6fbc440e55cf9841ecdcf21612dbcf95: Status 404 returned error can't find the container with id 61542d519385aadf0f933d58b664e7ea6fbc440e55cf9841ecdcf21612dbcf95 Apr 16 14:01:16.989116 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:16.989082 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94","Type":"ContainerStarted","Data":"61542d519385aadf0f933d58b664e7ea6fbc440e55cf9841ecdcf21612dbcf95"} Apr 16 14:01:17.116766 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.116736 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"] Apr 16 14:01:17.119279 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.119244 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk" Apr 16 14:01:17.126469 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.125827 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 14:01:17.126469 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.125853 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 14:01:17.126469 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.125837 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 14:01:17.127385 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.126892 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-9abe0h3gk8hn3\"" Apr 16 14:01:17.127385 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.126939 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 14:01:17.127385 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.127244 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-z52z5\"" Apr 16 14:01:17.131148 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.127871 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 14:01:17.135839 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.135817 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-thanos-querier-kube-rbac-proxy-rules\") pod 
\"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.135948 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.135913 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-grpc-tls\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.135948 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.135943 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-thanos-querier-tls\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.136054 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.135972 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d700b8d8-e872-461f-98da-59ab8f1ffa2c-metrics-client-ca\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.136054 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.136028 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.136133 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.136085 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-659jt\" (UniqueName: \"kubernetes.io/projected/d700b8d8-e872-461f-98da-59ab8f1ffa2c-kube-api-access-659jt\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.136133 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.136120 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.136212 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.136148 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.148416 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.148374 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"]
Apr 16 14:01:17.236783 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.236743 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-grpc-tls\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.237222 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.236797 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-thanos-querier-tls\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.237222 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.236828 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d700b8d8-e872-461f-98da-59ab8f1ffa2c-metrics-client-ca\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.237222 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.236864 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.237222 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.236909 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-659jt\" (UniqueName: \"kubernetes.io/projected/d700b8d8-e872-461f-98da-59ab8f1ffa2c-kube-api-access-659jt\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.237222 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.236954 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.237222 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.236988 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.237222 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.237035 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.237768 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.237715 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d700b8d8-e872-461f-98da-59ab8f1ffa2c-metrics-client-ca\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.240170 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.240088 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-grpc-tls\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.240660 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.240636 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.240806 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.240779 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.240978 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.240947 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-thanos-querier-tls\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.241103 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.241011 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.241543 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.241520 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d700b8d8-e872-461f-98da-59ab8f1ffa2c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.252921 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.252888 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-659jt\" (UniqueName: \"kubernetes.io/projected/d700b8d8-e872-461f-98da-59ab8f1ffa2c-kube-api-access-659jt\") pod \"thanos-querier-7cbfb44866-n8qrk\" (UID: \"d700b8d8-e872-461f-98da-59ab8f1ffa2c\") " pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.434468 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.434439 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:17.565075 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.565033 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"]
Apr 16 14:01:17.570905 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:01:17.570849 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd700b8d8_e872_461f_98da_59ab8f1ffa2c.slice/crio-47cda8bbd12c80c3f73b5cb4ca2fde3f6b6d8daa84734e2e73f12800ac67e527 WatchSource:0}: Error finding container 47cda8bbd12c80c3f73b5cb4ca2fde3f6b6d8daa84734e2e73f12800ac67e527: Status 404 returned error can't find the container with id 47cda8bbd12c80c3f73b5cb4ca2fde3f6b6d8daa84734e2e73f12800ac67e527
Apr 16 14:01:17.962583 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.962554 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xstwc"
Apr 16 14:01:17.993689 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.993654 2569 generic.go:358] "Generic (PLEG): container finished" podID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerID="e841c6df54d6f87669382972c4940eb15cc90e7e87bf2f150dcab8cdddc7957f" exitCode=0
Apr 16 14:01:17.993903 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.993752 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94","Type":"ContainerDied","Data":"e841c6df54d6f87669382972c4940eb15cc90e7e87bf2f150dcab8cdddc7957f"}
Apr 16 14:01:17.996138 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.996110 2569 generic.go:358] "Generic (PLEG): container finished" podID="ea67638c-bae3-407e-bb50-aeeae5ea8f7d" containerID="14353d4d475b2634fd50c15441aaf80aec29fc0b89aee0e7714ac8f59e6ae524" exitCode=0
Apr 16 14:01:17.996243 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.996226 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w7qjp" event={"ID":"ea67638c-bae3-407e-bb50-aeeae5ea8f7d","Type":"ContainerDied","Data":"14353d4d475b2634fd50c15441aaf80aec29fc0b89aee0e7714ac8f59e6ae524"}
Apr 16 14:01:17.998456 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:17.997530 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk" event={"ID":"d700b8d8-e872-461f-98da-59ab8f1ffa2c","Type":"ContainerStarted","Data":"47cda8bbd12c80c3f73b5cb4ca2fde3f6b6d8daa84734e2e73f12800ac67e527"}
Apr 16 14:01:18.789575 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:18.789539 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-99jzx"]
Apr 16 14:01:18.791826 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:18.791804 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-99jzx"
Apr 16 14:01:18.794329 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:18.794309 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-bvmlm\""
Apr 16 14:01:18.794707 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:18.794689 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 14:01:18.803978 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:18.803954 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-99jzx"]
Apr 16 14:01:18.849428 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:18.849396 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1c788be0-8df6-44b5-9585-852e5bae9147-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-99jzx\" (UID: \"1c788be0-8df6-44b5-9585-852e5bae9147\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-99jzx"
Apr 16 14:01:18.950442 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:18.950406 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1c788be0-8df6-44b5-9585-852e5bae9147-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-99jzx\" (UID: \"1c788be0-8df6-44b5-9585-852e5bae9147\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-99jzx"
Apr 16 14:01:18.953243 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:18.953190 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1c788be0-8df6-44b5-9585-852e5bae9147-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-99jzx\" (UID: \"1c788be0-8df6-44b5-9585-852e5bae9147\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-99jzx"
Apr 16 14:01:19.002904 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:19.002853 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w7qjp" event={"ID":"ea67638c-bae3-407e-bb50-aeeae5ea8f7d","Type":"ContainerStarted","Data":"a71471cb4611093cadaf67dffca387a448b0e189ac31d25258ef2909ebdfb4a6"}
Apr 16 14:01:19.002904 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:19.002901 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w7qjp" event={"ID":"ea67638c-bae3-407e-bb50-aeeae5ea8f7d","Type":"ContainerStarted","Data":"38ee876e63430686d2f29ce2b857f99108aa83286a58c4e34131592db5c368b2"}
Apr 16 14:01:19.021207 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:19.021128 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-w7qjp" podStartSLOduration=3.5358400100000003 podStartE2EDuration="5.021111216s" podCreationTimestamp="2026-04-16 14:01:14 +0000 UTC" firstStartedPulling="2026-04-16 14:01:15.861154243 +0000 UTC m=+110.869390477" lastFinishedPulling="2026-04-16 14:01:17.34642545 +0000 UTC m=+112.354661683" observedRunningTime="2026-04-16 14:01:19.02067449 +0000 UTC m=+114.028910744" watchObservedRunningTime="2026-04-16 14:01:19.021111216 +0000 UTC m=+114.029347473"
Apr 16 14:01:19.103645 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:19.103567 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-99jzx"
Apr 16 14:01:19.223181 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:19.223139 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-99jzx"]
Apr 16 14:01:19.227006 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:01:19.226978 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c788be0_8df6_44b5_9585_852e5bae9147.slice/crio-589822c5413784b8ce678975256f12c9af1baf934e0b6413807efdc89b0438a7 WatchSource:0}: Error finding container 589822c5413784b8ce678975256f12c9af1baf934e0b6413807efdc89b0438a7: Status 404 returned error can't find the container with id 589822c5413784b8ce678975256f12c9af1baf934e0b6413807efdc89b0438a7
Apr 16 14:01:20.007594 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:20.007564 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk" event={"ID":"d700b8d8-e872-461f-98da-59ab8f1ffa2c","Type":"ContainerStarted","Data":"02415b77fdba7e6a039fda44a39947dadc8790b6884ce795ce4f15c98f7eaf9d"}
Apr 16 14:01:20.007971 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:20.007605 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk" event={"ID":"d700b8d8-e872-461f-98da-59ab8f1ffa2c","Type":"ContainerStarted","Data":"7f8aab300121776a1cad79f4d9a821809734fc236b555717b7cc9305c585acae"}
Apr 16 14:01:20.008623 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:20.008596 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-99jzx" event={"ID":"1c788be0-8df6-44b5-9585-852e5bae9147","Type":"ContainerStarted","Data":"589822c5413784b8ce678975256f12c9af1baf934e0b6413807efdc89b0438a7"}
Apr 16 14:01:20.042931 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:20.042890 2569 patch_prober.go:28] interesting pod/image-registry-7df7868c7f-q22rr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 14:01:20.043112 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:20.042956 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7df7868c7f-q22rr" podUID="6ba3df13-9e23-42a3-86a2-4929bfedb89a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:01:21.013706 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:21.013665 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk" event={"ID":"d700b8d8-e872-461f-98da-59ab8f1ffa2c","Type":"ContainerStarted","Data":"b623bb3879f6f5ea988857e3c5c86cc6316845dad9919c2b61ce679056cba4a4"}
Apr 16 14:01:21.928630 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:21.928599 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7df7868c7f-q22rr"
Apr 16 14:01:22.018873 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:22.018835 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk" event={"ID":"d700b8d8-e872-461f-98da-59ab8f1ffa2c","Type":"ContainerStarted","Data":"53b1e1b2f0a93326098da98309aaf6d45cb4335d407d042e40ed1029b34fbf2a"}
Apr 16 14:01:22.018873 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:22.018878 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk" event={"ID":"d700b8d8-e872-461f-98da-59ab8f1ffa2c","Type":"ContainerStarted","Data":"86e957a4178d0f78918545872012370bc33f63b0091d2afecd5c5716d7fb78fe"}
Apr 16 14:01:22.019381 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:22.018888 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk" event={"ID":"d700b8d8-e872-461f-98da-59ab8f1ffa2c","Type":"ContainerStarted","Data":"199c23b4d08e1212e564d1a1b52cad3e58f3777ae0c0dcc4015806814eda53cc"}
Apr 16 14:01:22.019381 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:22.019005 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:22.021515 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:22.021488 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94","Type":"ContainerStarted","Data":"d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef"}
Apr 16 14:01:22.021648 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:22.021519 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94","Type":"ContainerStarted","Data":"7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d"}
Apr 16 14:01:22.021648 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:22.021532 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94","Type":"ContainerStarted","Data":"6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705"}
Apr 16 14:01:22.021648 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:22.021545 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94","Type":"ContainerStarted","Data":"11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62"}
Apr 16 14:01:22.021648 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:22.021556 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94","Type":"ContainerStarted","Data":"d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651"}
Apr 16 14:01:22.021648 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:22.021568 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94","Type":"ContainerStarted","Data":"de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3"}
Apr 16 14:01:22.022991 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:22.022964 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-99jzx" event={"ID":"1c788be0-8df6-44b5-9585-852e5bae9147","Type":"ContainerStarted","Data":"94f2de9c378a07683d23b25623b4dcbe3b3aebd8f5717cd64f680b57f05193b3"}
Apr 16 14:01:22.023170 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:22.023155 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-99jzx"
Apr 16 14:01:22.027641 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:22.027622 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-99jzx"
Apr 16 14:01:22.040991 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:22.040946 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk" podStartSLOduration=1.240365844 podStartE2EDuration="5.040932819s" podCreationTimestamp="2026-04-16 14:01:17 +0000 UTC" firstStartedPulling="2026-04-16 14:01:17.572814063 +0000 UTC m=+112.581050296" lastFinishedPulling="2026-04-16 14:01:21.373381037 +0000 UTC m=+116.381617271" observedRunningTime="2026-04-16 14:01:22.040069966 +0000 UTC m=+117.048306220" watchObservedRunningTime="2026-04-16 14:01:22.040932819 +0000 UTC m=+117.049169112"
Apr 16 14:01:22.068239 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:22.068187 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.9014437690000001 podStartE2EDuration="7.068171309s" podCreationTimestamp="2026-04-16 14:01:15 +0000 UTC" firstStartedPulling="2026-04-16 14:01:16.207229798 +0000 UTC m=+111.215466031" lastFinishedPulling="2026-04-16 14:01:21.373957338 +0000 UTC m=+116.382193571" observedRunningTime="2026-04-16 14:01:22.065529774 +0000 UTC m=+117.073766028" watchObservedRunningTime="2026-04-16 14:01:22.068171309 +0000 UTC m=+117.076407563"
Apr 16 14:01:22.082452 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:22.082402 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-99jzx" podStartSLOduration=1.939512948 podStartE2EDuration="4.082388392s" podCreationTimestamp="2026-04-16 14:01:18 +0000 UTC" firstStartedPulling="2026-04-16 14:01:19.2289338 +0000 UTC m=+114.237170033" lastFinishedPulling="2026-04-16 14:01:21.371809233 +0000 UTC m=+116.380045477" observedRunningTime="2026-04-16 14:01:22.080817218 +0000 UTC m=+117.089053473" watchObservedRunningTime="2026-04-16 14:01:22.082388392 +0000 UTC m=+117.090624647"
Apr 16 14:01:26.021470 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.021428 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56d8cccb6c-h957m"]
Apr 16 14:01:26.024547 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.024523 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.027378 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.027347 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 14:01:26.028596 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.028563 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 14:01:26.028739 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.028593 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 14:01:26.028817 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.028648 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 14:01:26.028817 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.028648 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 14:01:26.028952 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.028648 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-9gz7z\""
Apr 16 14:01:26.028952 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.028650 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 14:01:26.028952 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.028670 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 14:01:26.035063 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.035039 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56d8cccb6c-h957m"]
Apr 16 14:01:26.121448 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.121415 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-service-ca\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.121616 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.121462 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-oauth-config\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.121616 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.121535 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-oauth-serving-cert\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.121616 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.121582 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-config\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.121715 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.121617 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t99jz\" (UniqueName: \"kubernetes.io/projected/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-kube-api-access-t99jz\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.121715 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.121688 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-serving-cert\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.222621 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.222587 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-oauth-serving-cert\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.222771 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.222624 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-config\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.222771 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.222662 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t99jz\" (UniqueName: \"kubernetes.io/projected/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-kube-api-access-t99jz\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.222771 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.222691 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-serving-cert\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.222771 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.222715 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-service-ca\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.222771 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.222746 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-oauth-config\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.223394 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.223367 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-oauth-serving-cert\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.223496 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.223474 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-service-ca\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.223553 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.223474 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-config\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.225270 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.225234 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-oauth-config\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.225350 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.225319 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-serving-cert\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.231301 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.231276 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t99jz\" (UniqueName: \"kubernetes.io/projected/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-kube-api-access-t99jz\") pod \"console-56d8cccb6c-h957m\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.334399 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.334293 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56d8cccb6c-h957m"
Apr 16 14:01:26.452111 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:26.452077 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56d8cccb6c-h957m"]
Apr 16 14:01:26.455156 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:01:26.455114 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24a15385_eb1d_4fe7_9bdc_1b5f7f90b62d.slice/crio-d20847912e3724fe9a030e266d849bb19554c67ec50d9fe8dfca605f36a18a6a WatchSource:0}: Error finding container d20847912e3724fe9a030e266d849bb19554c67ec50d9fe8dfca605f36a18a6a: Status 404 returned error can't find the container with id d20847912e3724fe9a030e266d849bb19554c67ec50d9fe8dfca605f36a18a6a
Apr 16 14:01:27.038320 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:27.038282 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d8cccb6c-h957m" event={"ID":"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d","Type":"ContainerStarted","Data":"d20847912e3724fe9a030e266d849bb19554c67ec50d9fe8dfca605f36a18a6a"}
Apr 16 14:01:28.034346 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:28.034307 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7cbfb44866-n8qrk"
Apr 16 14:01:30.048218 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:30.048180 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d8cccb6c-h957m" event={"ID":"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d","Type":"ContainerStarted","Data":"afb089eebd501469c1e32b4fc7b4766ad6d983ee846ec96d52e2641df0d3ac4a"}
Apr 16 14:01:30.070722 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:30.070666 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56d8cccb6c-h957m" podStartSLOduration=1.135397758 podStartE2EDuration="4.070650675s"
podCreationTimestamp="2026-04-16 14:01:26 +0000 UTC" firstStartedPulling="2026-04-16 14:01:26.457165887 +0000 UTC m=+121.465402120" lastFinishedPulling="2026-04-16 14:01:29.392418804 +0000 UTC m=+124.400655037" observedRunningTime="2026-04-16 14:01:30.069983933 +0000 UTC m=+125.078220188" watchObservedRunningTime="2026-04-16 14:01:30.070650675 +0000 UTC m=+125.078886930" Apr 16 14:01:35.312124 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.312090 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c977d945c-sppg4"] Apr 16 14:01:35.315558 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.315541 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.323853 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.323824 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 14:01:35.329553 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.329525 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c977d945c-sppg4"] Apr 16 14:01:35.407063 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.407028 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctq9n\" (UniqueName: \"kubernetes.io/projected/43157412-d832-43d6-bb75-2e48c9408000-kube-api-access-ctq9n\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.407063 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.407068 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-console-config\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " 
pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.407320 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.407098 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs\") pod \"network-metrics-daemon-mkz26\" (UID: \"c449dabf-b9f5-4136-b598-074040f02629\") " pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 14:01:35.407320 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.407172 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43157412-d832-43d6-bb75-2e48c9408000-console-serving-cert\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.407320 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.407227 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-service-ca\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.407320 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.407276 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43157412-d832-43d6-bb75-2e48c9408000-console-oauth-config\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.407320 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.407293 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-oauth-serving-cert\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.407487 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.407325 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-trusted-ca-bundle\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.409393 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.409373 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c449dabf-b9f5-4136-b598-074040f02629-metrics-certs\") pod \"network-metrics-daemon-mkz26\" (UID: \"c449dabf-b9f5-4136-b598-074040f02629\") " pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 14:01:35.508442 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.508394 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctq9n\" (UniqueName: \"kubernetes.io/projected/43157412-d832-43d6-bb75-2e48c9408000-kube-api-access-ctq9n\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.508442 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.508451 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-console-config\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.508706 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.508477 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43157412-d832-43d6-bb75-2e48c9408000-console-serving-cert\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.508706 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.508503 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-service-ca\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.508706 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.508534 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43157412-d832-43d6-bb75-2e48c9408000-console-oauth-config\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.508706 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.508549 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-oauth-serving-cert\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.508706 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.508582 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-trusted-ca-bundle\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.509306 ip-10-0-128-60 
kubenswrapper[2569]: I0416 14:01:35.509272 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-oauth-serving-cert\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.509430 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.509330 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-console-config\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.509481 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.509468 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-service-ca\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.509522 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.509469 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-trusted-ca-bundle\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.511264 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.511230 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43157412-d832-43d6-bb75-2e48c9408000-console-serving-cert\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.511338 
ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.511263 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43157412-d832-43d6-bb75-2e48c9408000-console-oauth-config\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.513939 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.513925 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p5bbb\"" Apr 16 14:01:35.517563 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.517537 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctq9n\" (UniqueName: \"kubernetes.io/projected/43157412-d832-43d6-bb75-2e48c9408000-kube-api-access-ctq9n\") pod \"console-5c977d945c-sppg4\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.522130 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.522112 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mkz26" Apr 16 14:01:35.624758 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.624721 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:35.673822 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.673796 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mkz26"] Apr 16 14:01:35.677811 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:01:35.677763 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc449dabf_b9f5_4136_b598_074040f02629.slice/crio-789f49e7c1593f6aa0bf7e1408cfc30a6cec8a1366b49ea9ffab5c2ed0ddf9bb WatchSource:0}: Error finding container 789f49e7c1593f6aa0bf7e1408cfc30a6cec8a1366b49ea9ffab5c2ed0ddf9bb: Status 404 returned error can't find the container with id 789f49e7c1593f6aa0bf7e1408cfc30a6cec8a1366b49ea9ffab5c2ed0ddf9bb Apr 16 14:01:35.755960 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:35.755813 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c977d945c-sppg4"] Apr 16 14:01:35.758600 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:01:35.758574 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43157412_d832_43d6_bb75_2e48c9408000.slice/crio-8e19c50a231e05e4dfa2b0736b4aefc98b987d21dbbef0a072f35c8f7300977d WatchSource:0}: Error finding container 8e19c50a231e05e4dfa2b0736b4aefc98b987d21dbbef0a072f35c8f7300977d: Status 404 returned error can't find the container with id 8e19c50a231e05e4dfa2b0736b4aefc98b987d21dbbef0a072f35c8f7300977d Apr 16 14:01:36.068455 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:36.068415 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c977d945c-sppg4" event={"ID":"43157412-d832-43d6-bb75-2e48c9408000","Type":"ContainerStarted","Data":"5aabb049ec43774485e16c43dda465a8b8a18c21a723d2ff406cef42fadd6f08"} Apr 16 14:01:36.068455 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:36.068458 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c977d945c-sppg4" event={"ID":"43157412-d832-43d6-bb75-2e48c9408000","Type":"ContainerStarted","Data":"8e19c50a231e05e4dfa2b0736b4aefc98b987d21dbbef0a072f35c8f7300977d"} Apr 16 14:01:36.070145 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:36.070114 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mkz26" event={"ID":"c449dabf-b9f5-4136-b598-074040f02629","Type":"ContainerStarted","Data":"789f49e7c1593f6aa0bf7e1408cfc30a6cec8a1366b49ea9ffab5c2ed0ddf9bb"} Apr 16 14:01:36.090226 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:36.090091 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c977d945c-sppg4" podStartSLOduration=1.090071642 podStartE2EDuration="1.090071642s" podCreationTimestamp="2026-04-16 14:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:36.089469461 +0000 UTC m=+131.097705719" watchObservedRunningTime="2026-04-16 14:01:36.090071642 +0000 UTC m=+131.098307898" Apr 16 14:01:36.335539 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:36.335499 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56d8cccb6c-h957m" Apr 16 14:01:36.335921 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:36.335555 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56d8cccb6c-h957m" Apr 16 14:01:36.340708 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:36.340633 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56d8cccb6c-h957m" Apr 16 14:01:37.074555 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:37.074516 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mkz26" 
event={"ID":"c449dabf-b9f5-4136-b598-074040f02629","Type":"ContainerStarted","Data":"1faad221536c73958fc116eb548e1f0e64cc55d7cdae6bda3eb9292ab27a9890"} Apr 16 14:01:37.074555 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:37.074561 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mkz26" event={"ID":"c449dabf-b9f5-4136-b598-074040f02629","Type":"ContainerStarted","Data":"969f694910109bedea60d9b674b566eda2fa5e0d83c14be2ce07c4c211ed3098"} Apr 16 14:01:37.078443 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:37.078419 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56d8cccb6c-h957m" Apr 16 14:01:37.097577 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:37.097525 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mkz26" podStartSLOduration=131.165753834 podStartE2EDuration="2m12.097509756s" podCreationTimestamp="2026-04-16 13:59:25 +0000 UTC" firstStartedPulling="2026-04-16 14:01:35.679877228 +0000 UTC m=+130.688113475" lastFinishedPulling="2026-04-16 14:01:36.611633164 +0000 UTC m=+131.619869397" observedRunningTime="2026-04-16 14:01:37.097114473 +0000 UTC m=+132.105350728" watchObservedRunningTime="2026-04-16 14:01:37.097509756 +0000 UTC m=+132.105746012" Apr 16 14:01:42.089869 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:42.089778 2569 generic.go:358] "Generic (PLEG): container finished" podID="21db3010-f35e-486a-9584-0dc09d164c21" containerID="4da0eeccb0b84905869c94b89ae004c3849c283827805e49ef7ae73e3fb20af2" exitCode=0 Apr 16 14:01:42.089869 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:42.089835 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9" event={"ID":"21db3010-f35e-486a-9584-0dc09d164c21","Type":"ContainerDied","Data":"4da0eeccb0b84905869c94b89ae004c3849c283827805e49ef7ae73e3fb20af2"} Apr 16 
14:01:42.090336 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:42.090196 2569 scope.go:117] "RemoveContainer" containerID="4da0eeccb0b84905869c94b89ae004c3849c283827805e49ef7ae73e3fb20af2" Apr 16 14:01:43.094331 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:43.094293 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-xztn9" event={"ID":"21db3010-f35e-486a-9584-0dc09d164c21","Type":"ContainerStarted","Data":"8e7e78d563ae38bdd5671bc475d04850879265ad84ba79c3ac2e54441aab8440"} Apr 16 14:01:45.625579 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:45.625541 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:45.625579 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:45.625585 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:45.630408 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:45.630383 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:46.106853 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:46.106822 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:01:46.160977 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:01:46.160943 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56d8cccb6c-h957m"] Apr 16 14:02:11.185472 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.185432 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-56d8cccb6c-h957m" podUID="24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d" containerName="console" containerID="cri-o://afb089eebd501469c1e32b4fc7b4766ad6d983ee846ec96d52e2641df0d3ac4a" gracePeriod=15 Apr 16 14:02:11.427781 ip-10-0-128-60 kubenswrapper[2569]: 
I0416 14:02:11.427758 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56d8cccb6c-h957m_24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d/console/0.log" Apr 16 14:02:11.427928 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.427821 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56d8cccb6c-h957m" Apr 16 14:02:11.524134 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.524047 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-oauth-config\") pod \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " Apr 16 14:02:11.524134 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.524092 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-service-ca\") pod \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " Apr 16 14:02:11.524134 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.524123 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-config\") pod \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " Apr 16 14:02:11.524409 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.524304 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-oauth-serving-cert\") pod \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " Apr 16 14:02:11.524409 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.524379 2569 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t99jz\" (UniqueName: \"kubernetes.io/projected/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-kube-api-access-t99jz\") pod \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " Apr 16 14:02:11.524409 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.524402 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-serving-cert\") pod \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\" (UID: \"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d\") " Apr 16 14:02:11.524642 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.524539 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-config" (OuterVolumeSpecName: "console-config") pod "24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d" (UID: "24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:11.524642 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.524545 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-service-ca" (OuterVolumeSpecName: "service-ca") pod "24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d" (UID: "24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:11.524642 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.524637 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-config\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:11.524765 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.524650 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-service-ca\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:11.524765 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.524681 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d" (UID: "24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:11.526429 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.526407 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d" (UID: "24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:11.526497 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.526441 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d" (UID: "24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:11.526534 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.526517 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-kube-api-access-t99jz" (OuterVolumeSpecName: "kube-api-access-t99jz") pod "24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d" (UID: "24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d"). InnerVolumeSpecName "kube-api-access-t99jz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:02:11.625403 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.625368 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-oauth-config\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:11.625403 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.625397 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-oauth-serving-cert\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:11.625403 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.625408 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t99jz\" (UniqueName: \"kubernetes.io/projected/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-kube-api-access-t99jz\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:11.625615 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:11.625417 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d-console-serving-cert\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:12.182365 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:12.182333 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-56d8cccb6c-h957m_24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d/console/0.log" Apr 16 14:02:12.182564 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:12.182380 2569 generic.go:358] "Generic (PLEG): container finished" podID="24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d" containerID="afb089eebd501469c1e32b4fc7b4766ad6d983ee846ec96d52e2641df0d3ac4a" exitCode=2 Apr 16 14:02:12.182564 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:12.182439 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d8cccb6c-h957m" event={"ID":"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d","Type":"ContainerDied","Data":"afb089eebd501469c1e32b4fc7b4766ad6d983ee846ec96d52e2641df0d3ac4a"} Apr 16 14:02:12.182564 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:12.182472 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d8cccb6c-h957m" event={"ID":"24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d","Type":"ContainerDied","Data":"d20847912e3724fe9a030e266d849bb19554c67ec50d9fe8dfca605f36a18a6a"} Apr 16 14:02:12.182564 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:12.182484 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56d8cccb6c-h957m" Apr 16 14:02:12.182727 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:12.182491 2569 scope.go:117] "RemoveContainer" containerID="afb089eebd501469c1e32b4fc7b4766ad6d983ee846ec96d52e2641df0d3ac4a" Apr 16 14:02:12.190601 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:12.190415 2569 scope.go:117] "RemoveContainer" containerID="afb089eebd501469c1e32b4fc7b4766ad6d983ee846ec96d52e2641df0d3ac4a" Apr 16 14:02:12.190851 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:02:12.190678 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afb089eebd501469c1e32b4fc7b4766ad6d983ee846ec96d52e2641df0d3ac4a\": container with ID starting with afb089eebd501469c1e32b4fc7b4766ad6d983ee846ec96d52e2641df0d3ac4a not found: ID does not exist" containerID="afb089eebd501469c1e32b4fc7b4766ad6d983ee846ec96d52e2641df0d3ac4a" Apr 16 14:02:12.190851 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:12.190705 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb089eebd501469c1e32b4fc7b4766ad6d983ee846ec96d52e2641df0d3ac4a"} err="failed to get container status \"afb089eebd501469c1e32b4fc7b4766ad6d983ee846ec96d52e2641df0d3ac4a\": rpc error: code = NotFound desc = could not find container \"afb089eebd501469c1e32b4fc7b4766ad6d983ee846ec96d52e2641df0d3ac4a\": container with ID starting with afb089eebd501469c1e32b4fc7b4766ad6d983ee846ec96d52e2641df0d3ac4a not found: ID does not exist" Apr 16 14:02:12.204827 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:12.204798 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56d8cccb6c-h957m"] Apr 16 14:02:12.209628 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:12.209602 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56d8cccb6c-h957m"] Apr 16 14:02:13.603406 ip-10-0-128-60 kubenswrapper[2569]: I0416 
14:02:13.603374 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d" path="/var/lib/kubelet/pods/24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d/volumes" Apr 16 14:02:34.573816 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:34.573779 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:02:34.574333 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:34.574287 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="alertmanager" containerID="cri-o://de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3" gracePeriod=120 Apr 16 14:02:34.574420 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:34.574332 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="kube-rbac-proxy-metric" containerID="cri-o://7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d" gracePeriod=120 Apr 16 14:02:34.574420 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:34.574346 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="prom-label-proxy" containerID="cri-o://d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef" gracePeriod=120 Apr 16 14:02:34.574420 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:34.574366 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="kube-rbac-proxy-web" containerID="cri-o://11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62" gracePeriod=120 Apr 16 14:02:34.574420 ip-10-0-128-60 kubenswrapper[2569]: I0416 
14:02:34.574387 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="config-reloader" containerID="cri-o://d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651" gracePeriod=120 Apr 16 14:02:34.574575 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:34.574397 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="kube-rbac-proxy" containerID="cri-o://6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705" gracePeriod=120 Apr 16 14:02:35.252582 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.252553 2569 generic.go:358] "Generic (PLEG): container finished" podID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerID="d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef" exitCode=0 Apr 16 14:02:35.252582 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.252577 2569 generic.go:358] "Generic (PLEG): container finished" podID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerID="6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705" exitCode=0 Apr 16 14:02:35.252582 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.252583 2569 generic.go:358] "Generic (PLEG): container finished" podID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerID="d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651" exitCode=0 Apr 16 14:02:35.252582 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.252589 2569 generic.go:358] "Generic (PLEG): container finished" podID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerID="de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3" exitCode=0 Apr 16 14:02:35.252844 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.252608 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94","Type":"ContainerDied","Data":"d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef"} Apr 16 14:02:35.252844 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.252631 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94","Type":"ContainerDied","Data":"6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705"} Apr 16 14:02:35.252844 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.252640 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94","Type":"ContainerDied","Data":"d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651"} Apr 16 14:02:35.252844 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.252650 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94","Type":"ContainerDied","Data":"de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3"} Apr 16 14:02:35.818805 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.818780 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:02:35.926520 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.926489 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-web-config\") pod \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " Apr 16 14:02:35.926704 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.926554 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-cluster-tls-config\") pod \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " Apr 16 14:02:35.926704 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.926573 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-main-tls\") pod \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " Apr 16 14:02:35.926704 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.926590 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-config-volume\") pod \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " Apr 16 14:02:35.926704 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.926621 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy-web\") pod \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " Apr 16 
14:02:35.926704 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.926642 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-metrics-client-ca\") pod \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " Apr 16 14:02:35.926704 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.926666 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-alertmanager-main-db\") pod \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " Apr 16 14:02:35.926704 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.926690 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy\") pod \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " Apr 16 14:02:35.927058 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.926733 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf9hd\" (UniqueName: \"kubernetes.io/projected/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-kube-api-access-rf9hd\") pod \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " Apr 16 14:02:35.927058 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.926769 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-config-out\") pod \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " Apr 16 14:02:35.927058 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.926815 
2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-tls-assets\") pod \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " Apr 16 14:02:35.927058 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.926856 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy-metric\") pod \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " Apr 16 14:02:35.927058 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.926887 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-alertmanager-trusted-ca-bundle\") pod \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\" (UID: \"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94\") " Apr 16 14:02:35.927349 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.927060 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" (UID: "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:35.927349 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.927194 2569 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-metrics-client-ca\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:35.927499 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.927474 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" (UID: "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:35.927852 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.927795 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" (UID: "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:02:35.930200 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.930159 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-config-volume" (OuterVolumeSpecName: "config-volume") pod "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" (UID: "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:35.930580 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.930423 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" (UID: "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:02:35.930580 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.930553 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" (UID: "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:35.930732 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.930638 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" (UID: "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:35.931212 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.931178 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-config-out" (OuterVolumeSpecName: "config-out") pod "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" (UID: "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:02:35.931212 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.931201 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" (UID: "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:35.931806 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.931778 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" (UID: "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:35.931806 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.931781 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-kube-api-access-rf9hd" (OuterVolumeSpecName: "kube-api-access-rf9hd") pod "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" (UID: "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94"). InnerVolumeSpecName "kube-api-access-rf9hd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:02:35.934709 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.934678 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" (UID: "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94"). 
InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:35.941855 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:35.941830 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-web-config" (OuterVolumeSpecName: "web-config") pod "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" (UID: "6c8e5ea7-027c-46cd-a8e8-f50fb343bc94"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:36.027661 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.027618 2569 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-tls-assets\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.027661 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.027653 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.027661 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.027666 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.027885 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.027678 2569 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-web-config\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.027885 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.027687 2569 reconciler_common.go:299] "Volume detached for volume 
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-cluster-tls-config\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.027885 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.027697 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-main-tls\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.027885 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.027706 2569 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-config-volume\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.027885 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.027714 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.027885 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.027724 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-alertmanager-main-db\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.027885 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.027733 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.027885 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.027742 2569 reconciler_common.go:299] "Volume detached for volume 
\"kube-api-access-rf9hd\" (UniqueName: \"kubernetes.io/projected/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-kube-api-access-rf9hd\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.027885 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.027750 2569 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94-config-out\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.258389 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.258300 2569 generic.go:358] "Generic (PLEG): container finished" podID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerID="7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d" exitCode=0 Apr 16 14:02:36.258389 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.258329 2569 generic.go:358] "Generic (PLEG): container finished" podID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerID="11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62" exitCode=0 Apr 16 14:02:36.258389 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.258359 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94","Type":"ContainerDied","Data":"7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d"} Apr 16 14:02:36.258629 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.258395 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94","Type":"ContainerDied","Data":"11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62"} Apr 16 14:02:36.258629 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.258408 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"6c8e5ea7-027c-46cd-a8e8-f50fb343bc94","Type":"ContainerDied","Data":"61542d519385aadf0f933d58b664e7ea6fbc440e55cf9841ecdcf21612dbcf95"} Apr 16 14:02:36.258629 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.258425 2569 scope.go:117] "RemoveContainer" containerID="d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef" Apr 16 14:02:36.258629 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.258425 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:02:36.265876 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.265855 2569 scope.go:117] "RemoveContainer" containerID="7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d" Apr 16 14:02:36.275514 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.275492 2569 scope.go:117] "RemoveContainer" containerID="6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705" Apr 16 14:02:36.282295 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.282233 2569 scope.go:117] "RemoveContainer" containerID="11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62" Apr 16 14:02:36.283329 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.283296 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:02:36.287768 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.287748 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:02:36.289916 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.289875 2569 scope.go:117] "RemoveContainer" containerID="d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651" Apr 16 14:02:36.296230 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.296211 2569 scope.go:117] "RemoveContainer" containerID="de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3" Apr 16 14:02:36.302840 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.302821 2569 
scope.go:117] "RemoveContainer" containerID="e841c6df54d6f87669382972c4940eb15cc90e7e87bf2f150dcab8cdddc7957f" Apr 16 14:02:36.309808 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.309790 2569 scope.go:117] "RemoveContainer" containerID="d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef" Apr 16 14:02:36.310060 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:02:36.310040 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef\": container with ID starting with d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef not found: ID does not exist" containerID="d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef" Apr 16 14:02:36.310104 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.310068 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef"} err="failed to get container status \"d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef\": rpc error: code = NotFound desc = could not find container \"d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef\": container with ID starting with d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef not found: ID does not exist" Apr 16 14:02:36.310104 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.310088 2569 scope.go:117] "RemoveContainer" containerID="7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d" Apr 16 14:02:36.310346 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:02:36.310325 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d\": container with ID starting with 7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d not found: ID does not 
exist" containerID="7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d" Apr 16 14:02:36.310400 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.310355 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d"} err="failed to get container status \"7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d\": rpc error: code = NotFound desc = could not find container \"7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d\": container with ID starting with 7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d not found: ID does not exist" Apr 16 14:02:36.310400 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.310375 2569 scope.go:117] "RemoveContainer" containerID="6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705" Apr 16 14:02:36.310614 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:02:36.310595 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705\": container with ID starting with 6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705 not found: ID does not exist" containerID="6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705" Apr 16 14:02:36.310678 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.310624 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705"} err="failed to get container status \"6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705\": rpc error: code = NotFound desc = could not find container \"6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705\": container with ID starting with 6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705 not found: ID does not exist" Apr 16 
14:02:36.310678 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.310647 2569 scope.go:117] "RemoveContainer" containerID="11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62"
Apr 16 14:02:36.310874 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:02:36.310858 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62\": container with ID starting with 11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62 not found: ID does not exist" containerID="11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62"
Apr 16 14:02:36.310917 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.310878 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62"} err="failed to get container status \"11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62\": rpc error: code = NotFound desc = could not find container \"11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62\": container with ID starting with 11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62 not found: ID does not exist"
Apr 16 14:02:36.310917 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.310894 2569 scope.go:117] "RemoveContainer" containerID="d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651"
Apr 16 14:02:36.311135 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:02:36.311119 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651\": container with ID starting with d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651 not found: ID does not exist" containerID="d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651"
Apr 16 14:02:36.311190 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.311140 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651"} err="failed to get container status \"d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651\": rpc error: code = NotFound desc = could not find container \"d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651\": container with ID starting with d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651 not found: ID does not exist"
Apr 16 14:02:36.311190 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.311153 2569 scope.go:117] "RemoveContainer" containerID="de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3"
Apr 16 14:02:36.311393 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:02:36.311377 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3\": container with ID starting with de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3 not found: ID does not exist" containerID="de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3"
Apr 16 14:02:36.311443 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.311398 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3"} err="failed to get container status \"de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3\": rpc error: code = NotFound desc = could not find container \"de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3\": container with ID starting with de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3 not found: ID does not exist"
Apr 16 14:02:36.311443 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.311412 2569 scope.go:117] "RemoveContainer" containerID="e841c6df54d6f87669382972c4940eb15cc90e7e87bf2f150dcab8cdddc7957f"
Apr 16 14:02:36.311639 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:02:36.311623 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e841c6df54d6f87669382972c4940eb15cc90e7e87bf2f150dcab8cdddc7957f\": container with ID starting with e841c6df54d6f87669382972c4940eb15cc90e7e87bf2f150dcab8cdddc7957f not found: ID does not exist" containerID="e841c6df54d6f87669382972c4940eb15cc90e7e87bf2f150dcab8cdddc7957f"
Apr 16 14:02:36.311681 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.311645 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e841c6df54d6f87669382972c4940eb15cc90e7e87bf2f150dcab8cdddc7957f"} err="failed to get container status \"e841c6df54d6f87669382972c4940eb15cc90e7e87bf2f150dcab8cdddc7957f\": rpc error: code = NotFound desc = could not find container \"e841c6df54d6f87669382972c4940eb15cc90e7e87bf2f150dcab8cdddc7957f\": container with ID starting with e841c6df54d6f87669382972c4940eb15cc90e7e87bf2f150dcab8cdddc7957f not found: ID does not exist"
Apr 16 14:02:36.311681 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.311659 2569 scope.go:117] "RemoveContainer" containerID="d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef"
Apr 16 14:02:36.311861 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.311844 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef"} err="failed to get container status \"d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef\": rpc error: code = NotFound desc = could not find container \"d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef\": container with ID starting with d00bd80a269b6773e4ce27ad3e768b545246ecb28587426334b0e653ef77b1ef not found: ID does not exist"
Apr 16 14:02:36.311924 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.311862 2569 scope.go:117] "RemoveContainer" containerID="7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d"
Apr 16 14:02:36.312060 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.312044 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d"} err="failed to get container status \"7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d\": rpc error: code = NotFound desc = could not find container \"7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d\": container with ID starting with 7df08d993274ab62d57a3142769c119e981589f73d585e491cd1828b1c3a6e0d not found: ID does not exist"
Apr 16 14:02:36.312105 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.312060 2569 scope.go:117] "RemoveContainer" containerID="6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705"
Apr 16 14:02:36.312246 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.312222 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705"} err="failed to get container status \"6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705\": rpc error: code = NotFound desc = could not find container \"6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705\": container with ID starting with 6169d38eb346d3b52b4004ea728a7e63e5e7a37bd8e6ecab37dfbbc5055ba705 not found: ID does not exist"
Apr 16 14:02:36.312246 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.312240 2569 scope.go:117] "RemoveContainer" containerID="11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62"
Apr 16 14:02:36.312489 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.312462 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62"} err="failed to get container status \"11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62\": rpc error: code = NotFound desc = could not find container \"11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62\": container with ID starting with 11f9ceb6c882d687e635521e5efcb20c933c61e5cdaffcf050c1c8e7ef091d62 not found: ID does not exist"
Apr 16 14:02:36.312561 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.312490 2569 scope.go:117] "RemoveContainer" containerID="d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651"
Apr 16 14:02:36.312714 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.312695 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651"} err="failed to get container status \"d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651\": rpc error: code = NotFound desc = could not find container \"d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651\": container with ID starting with d9c57562c0db43115c47deb1bd9d3481d53f39f294e8b14631191640e4b31651 not found: ID does not exist"
Apr 16 14:02:36.312758 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.312714 2569 scope.go:117] "RemoveContainer" containerID="de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3"
Apr 16 14:02:36.312907 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.312891 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3"} err="failed to get container status \"de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3\": rpc error: code = NotFound desc = could not find container \"de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3\": container with ID starting with de949efb4e67a5d09e135e24380282c4602adc3f4128244a79cbf7f91d9bc1e3 not found: ID does not exist"
Apr 16 14:02:36.312950 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.312908 2569 scope.go:117] "RemoveContainer" containerID="e841c6df54d6f87669382972c4940eb15cc90e7e87bf2f150dcab8cdddc7957f"
Apr 16 14:02:36.313106 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.313090 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e841c6df54d6f87669382972c4940eb15cc90e7e87bf2f150dcab8cdddc7957f"} err="failed to get container status \"e841c6df54d6f87669382972c4940eb15cc90e7e87bf2f150dcab8cdddc7957f\": rpc error: code = NotFound desc = could not find container \"e841c6df54d6f87669382972c4940eb15cc90e7e87bf2f150dcab8cdddc7957f\": container with ID starting with e841c6df54d6f87669382972c4940eb15cc90e7e87bf2f150dcab8cdddc7957f not found: ID does not exist"
Apr 16 14:02:36.316037 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316014 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:02:36.316395 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316379 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="config-reloader"
Apr 16 14:02:36.316480 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316397 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="config-reloader"
Apr 16 14:02:36.316480 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316410 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="init-config-reloader"
Apr 16 14:02:36.316480 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316419 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="init-config-reloader"
Apr 16 14:02:36.316480 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316435 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="kube-rbac-proxy"
Apr 16 14:02:36.316480 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316445 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="kube-rbac-proxy"
Apr 16 14:02:36.316480 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316462 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="alertmanager"
Apr 16 14:02:36.316480 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316470 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="alertmanager"
Apr 16 14:02:36.316480 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316479 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="kube-rbac-proxy-web"
Apr 16 14:02:36.316841 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316487 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="kube-rbac-proxy-web"
Apr 16 14:02:36.316841 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316502 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="prom-label-proxy"
Apr 16 14:02:36.316841 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316511 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="prom-label-proxy"
Apr 16 14:02:36.316841 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316527 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="kube-rbac-proxy-metric"
Apr 16 14:02:36.316841 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316538 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="kube-rbac-proxy-metric"
Apr 16 14:02:36.316841 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316547 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d" containerName="console"
Apr 16 14:02:36.316841 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316556 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d" containerName="console"
Apr 16 14:02:36.316841 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316625 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="config-reloader"
Apr 16 14:02:36.316841 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316637 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="kube-rbac-proxy-web"
Apr 16 14:02:36.316841 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316650 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="alertmanager"
Apr 16 14:02:36.316841 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316660 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="kube-rbac-proxy-metric"
Apr 16 14:02:36.316841 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316670 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="prom-label-proxy"
Apr 16 14:02:36.316841 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316681 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" containerName="kube-rbac-proxy"
Apr 16 14:02:36.316841 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.316691 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="24a15385-eb1d-4fe7-9bdc-1b5f7f90b62d" containerName="console"
Apr 16 14:02:36.319907 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.319889 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.324377 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.324359 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 14:02:36.324460 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.324399 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-kp9lr\""
Apr 16 14:02:36.324657 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.324642 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 14:02:36.324722 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.324646 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 14:02:36.325070 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.325041 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 14:02:36.325192 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.325165 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 14:02:36.325285 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.325230 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 14:02:36.325358 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.325301 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 14:02:36.325569 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.325541 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 14:02:36.331108 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.331078 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 14:02:36.335497 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.335476 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:02:36.430111 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.430060 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/93d99919-c5f5-4373-a5c6-2329f656103c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.430111 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.430104 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93d99919-c5f5-4373-a5c6-2329f656103c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.430358 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.430129 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.430358 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.430154 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/93d99919-c5f5-4373-a5c6-2329f656103c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.430358 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.430184 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93d99919-c5f5-4373-a5c6-2329f656103c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.430358 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.430205 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-config-volume\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.430358 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.430275 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-web-config\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.430358 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.430324 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.430544 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.430360 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.430544 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.430422 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.430544 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.430440 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmm5b\" (UniqueName: \"kubernetes.io/projected/93d99919-c5f5-4373-a5c6-2329f656103c-kube-api-access-lmm5b\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.430544 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.430473 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.430544 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.430491 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/93d99919-c5f5-4373-a5c6-2329f656103c-config-out\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.531978 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.531884 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/93d99919-c5f5-4373-a5c6-2329f656103c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.531978 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.531925 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93d99919-c5f5-4373-a5c6-2329f656103c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.531978 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.531946 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.531978 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.531965 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/93d99919-c5f5-4373-a5c6-2329f656103c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.532320 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.531986 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93d99919-c5f5-4373-a5c6-2329f656103c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.532320 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.532028 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-config-volume\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.532320 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.532059 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-web-config\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.532320 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.532090 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.532320 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.532132 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.532320 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.532182 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.532320 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.532213 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmm5b\" (UniqueName: \"kubernetes.io/projected/93d99919-c5f5-4373-a5c6-2329f656103c-kube-api-access-lmm5b\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.532320 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.532283 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.532320 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.532316 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/93d99919-c5f5-4373-a5c6-2329f656103c-config-out\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.532736 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.532491 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/93d99919-c5f5-4373-a5c6-2329f656103c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.532848 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.532822 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93d99919-c5f5-4373-a5c6-2329f656103c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.533284 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.532957 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93d99919-c5f5-4373-a5c6-2329f656103c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.535226 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.535190 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/93d99919-c5f5-4373-a5c6-2329f656103c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.535342 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.535195 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.535379 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.535359 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-web-config\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.535379 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.535371 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.535451 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.535420 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-config-volume\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.535668 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.535646 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/93d99919-c5f5-4373-a5c6-2329f656103c-config-out\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.535811 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.535792 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.535850 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.535819 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.537060 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.537044 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/93d99919-c5f5-4373-a5c6-2329f656103c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.542498 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.542472 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmm5b\" (UniqueName: \"kubernetes.io/projected/93d99919-c5f5-4373-a5c6-2329f656103c-kube-api-access-lmm5b\") pod \"alertmanager-main-0\" (UID: \"93d99919-c5f5-4373-a5c6-2329f656103c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.629163 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.629127 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:02:36.758482 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:36.758453 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:02:36.760950 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:02:36.760917 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93d99919_c5f5_4373_a5c6_2329f656103c.slice/crio-e14b294b4e3bbf5eb76894ab70cf77ce74aeff939a31a8b51c0014fd9d401734 WatchSource:0}: Error finding container e14b294b4e3bbf5eb76894ab70cf77ce74aeff939a31a8b51c0014fd9d401734: Status 404 returned error can't find the container with id e14b294b4e3bbf5eb76894ab70cf77ce74aeff939a31a8b51c0014fd9d401734
Apr 16 14:02:37.263599 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:37.263569 2569 generic.go:358] "Generic (PLEG): container finished" podID="93d99919-c5f5-4373-a5c6-2329f656103c" containerID="e9a6bb26e8ee03d4feb482ccb9657a0d84970bc5059e6087559ad76684243d56" exitCode=0
Apr 16 14:02:37.264005 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:37.263615 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"93d99919-c5f5-4373-a5c6-2329f656103c","Type":"ContainerDied","Data":"e9a6bb26e8ee03d4feb482ccb9657a0d84970bc5059e6087559ad76684243d56"}
Apr 16 14:02:37.264005 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:37.263636 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"93d99919-c5f5-4373-a5c6-2329f656103c","Type":"ContainerStarted","Data":"e14b294b4e3bbf5eb76894ab70cf77ce74aeff939a31a8b51c0014fd9d401734"}
Apr 16 14:02:37.603408 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:37.603381 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c8e5ea7-027c-46cd-a8e8-f50fb343bc94" path="/var/lib/kubelet/pods/6c8e5ea7-027c-46cd-a8e8-f50fb343bc94/volumes"
Apr 16 14:02:38.270232 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:38.270198 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"93d99919-c5f5-4373-a5c6-2329f656103c","Type":"ContainerStarted","Data":"2af85be39fc385a0ba7ff04c04044dd961bb91926e1bd12acbeaeae59fb5e773"}
Apr 16 14:02:38.270232 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:38.270236 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"93d99919-c5f5-4373-a5c6-2329f656103c","Type":"ContainerStarted","Data":"33be060b55066e3e8a0402cf940cc4627011c966e208e4206a832f3a7f4126dd"}
Apr 16 14:02:38.270653 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:38.270245 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"93d99919-c5f5-4373-a5c6-2329f656103c","Type":"ContainerStarted","Data":"05ed7795c3349ed2b4c98d7ac892464bb99e7574d8abda672b9f5a36dcc4b2fe"}
Apr 16 14:02:38.270653 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:38.270269 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"93d99919-c5f5-4373-a5c6-2329f656103c","Type":"ContainerStarted","Data":"d934fb8948ecc420feb8f23b476d8a16bb3c36cbf1d947823ba0ddab99ed9a8a"}
Apr 16 14:02:38.270653 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:38.270277 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"93d99919-c5f5-4373-a5c6-2329f656103c","Type":"ContainerStarted","Data":"2adb552c3c607b686a34237ba9e2a61dfedb82307677bf01e30582fdc1a15859"}
Apr 16 14:02:38.270653 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:38.270286 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"93d99919-c5f5-4373-a5c6-2329f656103c","Type":"ContainerStarted","Data":"16eea63980f81075852f879df6d014be7f78a14caa6514b89fd8ff71f2efaa0d"}
Apr 16 14:02:38.308708 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:38.308658 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.308642406 podStartE2EDuration="2.308642406s" podCreationTimestamp="2026-04-16 14:02:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:02:38.303120907 +0000 UTC m=+193.311357162" watchObservedRunningTime="2026-04-16 14:02:38.308642406 +0000 UTC m=+193.316878663"
Apr 16 14:02:40.326154 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.326112 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5bdd8569b5-pflnz"]
Apr 16 14:02:40.328546 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.328526 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.347456 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.347422 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bdd8569b5-pflnz"] Apr 16 14:02:40.466849 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.466806 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-oauth-serving-cert\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.466849 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.466855 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-config\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.467052 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.466912 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-serving-cert\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.467052 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.466929 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-trusted-ca-bundle\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 
14:02:40.467052 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.467032 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-service-ca\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.467149 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.467068 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kswf\" (UniqueName: \"kubernetes.io/projected/380fbc5f-c4cb-47f2-b320-1c537ea71943-kube-api-access-6kswf\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.467149 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.467107 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-oauth-config\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.568228 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.568187 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-service-ca\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.568228 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.568231 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kswf\" (UniqueName: \"kubernetes.io/projected/380fbc5f-c4cb-47f2-b320-1c537ea71943-kube-api-access-6kswf\") pod 
\"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.568480 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.568279 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-oauth-config\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.568480 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.568306 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-oauth-serving-cert\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.568480 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.568330 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-config\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.568480 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.568371 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-serving-cert\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.568656 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.568506 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-trusted-ca-bundle\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.569085 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.569050 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-config\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.569192 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.569094 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-service-ca\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.569192 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.569154 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-oauth-serving-cert\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.569671 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.569645 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-trusted-ca-bundle\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.570733 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.570715 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-oauth-config\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.571043 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.571023 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-serving-cert\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.579519 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.579463 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kswf\" (UniqueName: \"kubernetes.io/projected/380fbc5f-c4cb-47f2-b320-1c537ea71943-kube-api-access-6kswf\") pod \"console-5bdd8569b5-pflnz\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.639806 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.639768 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:40.772010 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:40.771854 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bdd8569b5-pflnz"] Apr 16 14:02:40.774680 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:02:40.774654 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod380fbc5f_c4cb_47f2_b320_1c537ea71943.slice/crio-da617d1a76c6a68d89a2f74d2d2c77e3ab30ecb04cf075e76560ae20b8b170b3 WatchSource:0}: Error finding container da617d1a76c6a68d89a2f74d2d2c77e3ab30ecb04cf075e76560ae20b8b170b3: Status 404 returned error can't find the container with id da617d1a76c6a68d89a2f74d2d2c77e3ab30ecb04cf075e76560ae20b8b170b3 Apr 16 14:02:41.281707 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:41.281669 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bdd8569b5-pflnz" event={"ID":"380fbc5f-c4cb-47f2-b320-1c537ea71943","Type":"ContainerStarted","Data":"fc712d7fef9e21dfbb97197b8a86c60a6469740bdb95b659881d197d09a704af"} Apr 16 14:02:41.281707 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:41.281707 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bdd8569b5-pflnz" event={"ID":"380fbc5f-c4cb-47f2-b320-1c537ea71943","Type":"ContainerStarted","Data":"da617d1a76c6a68d89a2f74d2d2c77e3ab30ecb04cf075e76560ae20b8b170b3"} Apr 16 14:02:41.334818 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:41.334763 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5bdd8569b5-pflnz" podStartSLOduration=1.3347470860000001 podStartE2EDuration="1.334747086s" podCreationTimestamp="2026-04-16 14:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:02:41.332928699 +0000 UTC 
m=+196.341164966" watchObservedRunningTime="2026-04-16 14:02:41.334747086 +0000 UTC m=+196.342983341" Apr 16 14:02:50.640290 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:50.640223 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:50.640290 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:50.640298 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:50.644955 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:50.644932 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:51.313703 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:51.313664 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:02:51.383153 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:02:51.383116 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c977d945c-sppg4"] Apr 16 14:03:16.404757 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.404687 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c977d945c-sppg4" podUID="43157412-d832-43d6-bb75-2e48c9408000" containerName="console" containerID="cri-o://5aabb049ec43774485e16c43dda465a8b8a18c21a723d2ff406cef42fadd6f08" gracePeriod=15 Apr 16 14:03:16.664707 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.664684 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c977d945c-sppg4_43157412-d832-43d6-bb75-2e48c9408000/console/0.log" Apr 16 14:03:16.664832 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.664745 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c977d945c-sppg4" Apr 16 14:03:16.769379 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.769340 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-service-ca\") pod \"43157412-d832-43d6-bb75-2e48c9408000\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " Apr 16 14:03:16.769379 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.769394 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-console-config\") pod \"43157412-d832-43d6-bb75-2e48c9408000\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " Apr 16 14:03:16.769606 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.769442 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-trusted-ca-bundle\") pod \"43157412-d832-43d6-bb75-2e48c9408000\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " Apr 16 14:03:16.769606 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.769588 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-oauth-serving-cert\") pod \"43157412-d832-43d6-bb75-2e48c9408000\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " Apr 16 14:03:16.769698 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.769640 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43157412-d832-43d6-bb75-2e48c9408000-console-oauth-config\") pod \"43157412-d832-43d6-bb75-2e48c9408000\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " Apr 16 14:03:16.769698 ip-10-0-128-60 
kubenswrapper[2569]: I0416 14:03:16.769673 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctq9n\" (UniqueName: \"kubernetes.io/projected/43157412-d832-43d6-bb75-2e48c9408000-kube-api-access-ctq9n\") pod \"43157412-d832-43d6-bb75-2e48c9408000\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " Apr 16 14:03:16.769798 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.769701 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43157412-d832-43d6-bb75-2e48c9408000-console-serving-cert\") pod \"43157412-d832-43d6-bb75-2e48c9408000\" (UID: \"43157412-d832-43d6-bb75-2e48c9408000\") " Apr 16 14:03:16.769919 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.769814 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-service-ca" (OuterVolumeSpecName: "service-ca") pod "43157412-d832-43d6-bb75-2e48c9408000" (UID: "43157412-d832-43d6-bb75-2e48c9408000"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:16.769919 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.769880 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-console-config" (OuterVolumeSpecName: "console-config") pod "43157412-d832-43d6-bb75-2e48c9408000" (UID: "43157412-d832-43d6-bb75-2e48c9408000"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:16.770033 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.769964 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43157412-d832-43d6-bb75-2e48c9408000" (UID: "43157412-d832-43d6-bb75-2e48c9408000"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:16.770033 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.770003 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-service-ca\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:03:16.770033 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.770017 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-console-config\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:03:16.770033 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.770010 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43157412-d832-43d6-bb75-2e48c9408000" (UID: "43157412-d832-43d6-bb75-2e48c9408000"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:16.772017 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.771978 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43157412-d832-43d6-bb75-2e48c9408000-kube-api-access-ctq9n" (OuterVolumeSpecName: "kube-api-access-ctq9n") pod "43157412-d832-43d6-bb75-2e48c9408000" (UID: "43157412-d832-43d6-bb75-2e48c9408000"). 
InnerVolumeSpecName "kube-api-access-ctq9n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:03:16.772129 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.772021 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43157412-d832-43d6-bb75-2e48c9408000-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43157412-d832-43d6-bb75-2e48c9408000" (UID: "43157412-d832-43d6-bb75-2e48c9408000"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:16.772129 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.772048 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43157412-d832-43d6-bb75-2e48c9408000-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43157412-d832-43d6-bb75-2e48c9408000" (UID: "43157412-d832-43d6-bb75-2e48c9408000"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:16.870545 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.870505 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-trusted-ca-bundle\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:03:16.870545 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.870538 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43157412-d832-43d6-bb75-2e48c9408000-oauth-serving-cert\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:03:16.870545 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.870549 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43157412-d832-43d6-bb75-2e48c9408000-console-oauth-config\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" 
Apr 16 14:03:16.870770 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.870558 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ctq9n\" (UniqueName: \"kubernetes.io/projected/43157412-d832-43d6-bb75-2e48c9408000-kube-api-access-ctq9n\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:03:16.870770 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:16.870567 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43157412-d832-43d6-bb75-2e48c9408000-console-serving-cert\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:03:17.391851 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:17.391823 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c977d945c-sppg4_43157412-d832-43d6-bb75-2e48c9408000/console/0.log"
Apr 16 14:03:17.392026 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:17.391862 2569 generic.go:358] "Generic (PLEG): container finished" podID="43157412-d832-43d6-bb75-2e48c9408000" containerID="5aabb049ec43774485e16c43dda465a8b8a18c21a723d2ff406cef42fadd6f08" exitCode=2
Apr 16 14:03:17.392026 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:17.391924 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c977d945c-sppg4" event={"ID":"43157412-d832-43d6-bb75-2e48c9408000","Type":"ContainerDied","Data":"5aabb049ec43774485e16c43dda465a8b8a18c21a723d2ff406cef42fadd6f08"}
Apr 16 14:03:17.392026 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:17.391926 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c977d945c-sppg4"
Apr 16 14:03:17.392026 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:17.391951 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c977d945c-sppg4" event={"ID":"43157412-d832-43d6-bb75-2e48c9408000","Type":"ContainerDied","Data":"8e19c50a231e05e4dfa2b0736b4aefc98b987d21dbbef0a072f35c8f7300977d"}
Apr 16 14:03:17.392026 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:17.391968 2569 scope.go:117] "RemoveContainer" containerID="5aabb049ec43774485e16c43dda465a8b8a18c21a723d2ff406cef42fadd6f08"
Apr 16 14:03:17.400091 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:17.400068 2569 scope.go:117] "RemoveContainer" containerID="5aabb049ec43774485e16c43dda465a8b8a18c21a723d2ff406cef42fadd6f08"
Apr 16 14:03:17.400403 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:03:17.400381 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aabb049ec43774485e16c43dda465a8b8a18c21a723d2ff406cef42fadd6f08\": container with ID starting with 5aabb049ec43774485e16c43dda465a8b8a18c21a723d2ff406cef42fadd6f08 not found: ID does not exist" containerID="5aabb049ec43774485e16c43dda465a8b8a18c21a723d2ff406cef42fadd6f08"
Apr 16 14:03:17.400476 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:17.400416 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aabb049ec43774485e16c43dda465a8b8a18c21a723d2ff406cef42fadd6f08"} err="failed to get container status \"5aabb049ec43774485e16c43dda465a8b8a18c21a723d2ff406cef42fadd6f08\": rpc error: code = NotFound desc = could not find container \"5aabb049ec43774485e16c43dda465a8b8a18c21a723d2ff406cef42fadd6f08\": container with ID starting with 5aabb049ec43774485e16c43dda465a8b8a18c21a723d2ff406cef42fadd6f08 not found: ID does not exist"
Apr 16 14:03:17.420774 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:17.420736 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c977d945c-sppg4"]
Apr 16 14:03:17.427969 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:17.427939 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c977d945c-sppg4"]
Apr 16 14:03:17.607454 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:03:17.603689 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43157412-d832-43d6-bb75-2e48c9408000" path="/var/lib/kubelet/pods/43157412-d832-43d6-bb75-2e48c9408000/volumes"
Apr 16 14:04:25.471551 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:04:25.471520 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/2.log"
Apr 16 14:04:25.472463 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:04:25.472445 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/2.log"
Apr 16 14:06:25.410493 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.410317 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6955878f8c-qjfhf"]
Apr 16 14:06:25.411112 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.410727 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43157412-d832-43d6-bb75-2e48c9408000" containerName="console"
Apr 16 14:06:25.411112 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.410743 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="43157412-d832-43d6-bb75-2e48c9408000" containerName="console"
Apr 16 14:06:25.411112 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.410815 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="43157412-d832-43d6-bb75-2e48c9408000" containerName="console"
Apr 16 14:06:25.413722 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.413702 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.434621 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.434592 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6955878f8c-qjfhf"]
Apr 16 14:06:25.500047 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.500017 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b04d97-2b54-4049-a24d-d229a2567619-service-ca\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.500047 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.500052 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b04d97-2b54-4049-a24d-d229a2567619-oauth-serving-cert\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.500289 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.500070 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b04d97-2b54-4049-a24d-d229a2567619-console-config\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.500289 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.500099 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b04d97-2b54-4049-a24d-d229a2567619-console-serving-cert\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.500289 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.500145 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwx8m\" (UniqueName: \"kubernetes.io/projected/a1b04d97-2b54-4049-a24d-d229a2567619-kube-api-access-rwx8m\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.500289 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.500216 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b04d97-2b54-4049-a24d-d229a2567619-trusted-ca-bundle\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.500426 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.500292 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b04d97-2b54-4049-a24d-d229a2567619-console-oauth-config\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.600901 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.600864 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b04d97-2b54-4049-a24d-d229a2567619-console-serving-cert\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.601095 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.600909 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwx8m\" (UniqueName: \"kubernetes.io/projected/a1b04d97-2b54-4049-a24d-d229a2567619-kube-api-access-rwx8m\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.601095 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.600962 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b04d97-2b54-4049-a24d-d229a2567619-trusted-ca-bundle\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.601095 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.601012 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b04d97-2b54-4049-a24d-d229a2567619-console-oauth-config\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.601095 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.601045 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b04d97-2b54-4049-a24d-d229a2567619-service-ca\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.601338 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.601148 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b04d97-2b54-4049-a24d-d229a2567619-oauth-serving-cert\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.601338 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.601180 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b04d97-2b54-4049-a24d-d229a2567619-console-config\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.601861 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.601828 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b04d97-2b54-4049-a24d-d229a2567619-service-ca\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.601977 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.601881 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b04d97-2b54-4049-a24d-d229a2567619-console-config\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.602092 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.602068 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b04d97-2b54-4049-a24d-d229a2567619-oauth-serving-cert\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.602290 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.602266 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b04d97-2b54-4049-a24d-d229a2567619-trusted-ca-bundle\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf"
Apr 16 14:06:25.603722 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.603699 2569
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b04d97-2b54-4049-a24d-d229a2567619-console-oauth-config\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf" Apr 16 14:06:25.604348 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.604330 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b04d97-2b54-4049-a24d-d229a2567619-console-serving-cert\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf" Apr 16 14:06:25.618994 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.618963 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwx8m\" (UniqueName: \"kubernetes.io/projected/a1b04d97-2b54-4049-a24d-d229a2567619-kube-api-access-rwx8m\") pod \"console-6955878f8c-qjfhf\" (UID: \"a1b04d97-2b54-4049-a24d-d229a2567619\") " pod="openshift-console/console-6955878f8c-qjfhf" Apr 16 14:06:25.724824 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.724717 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6955878f8c-qjfhf" Apr 16 14:06:25.851107 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.851081 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6955878f8c-qjfhf"] Apr 16 14:06:25.853690 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:06:25.853663 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b04d97_2b54_4049_a24d_d229a2567619.slice/crio-778ef41e7a0eb43d6872807a01afad1ac9410c414bdbf8884fb8a714ba57679e WatchSource:0}: Error finding container 778ef41e7a0eb43d6872807a01afad1ac9410c414bdbf8884fb8a714ba57679e: Status 404 returned error can't find the container with id 778ef41e7a0eb43d6872807a01afad1ac9410c414bdbf8884fb8a714ba57679e Apr 16 14:06:25.855483 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.855467 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:06:25.919325 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.919291 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6955878f8c-qjfhf" event={"ID":"a1b04d97-2b54-4049-a24d-d229a2567619","Type":"ContainerStarted","Data":"dfb80a265a837cf2b8708968716d7c121589c35b94b4dd34ba32873fcc7fa17c"} Apr 16 14:06:25.919441 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.919334 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6955878f8c-qjfhf" event={"ID":"a1b04d97-2b54-4049-a24d-d229a2567619","Type":"ContainerStarted","Data":"778ef41e7a0eb43d6872807a01afad1ac9410c414bdbf8884fb8a714ba57679e"} Apr 16 14:06:25.940423 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:25.940349 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6955878f8c-qjfhf" podStartSLOduration=0.940328564 podStartE2EDuration="940.328564ms" podCreationTimestamp="2026-04-16 14:06:25 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:06:25.93885448 +0000 UTC m=+420.947090763" watchObservedRunningTime="2026-04-16 14:06:25.940328564 +0000 UTC m=+420.948564821" Apr 16 14:06:35.725810 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:35.725697 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6955878f8c-qjfhf" Apr 16 14:06:35.725810 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:35.725751 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6955878f8c-qjfhf" Apr 16 14:06:35.730681 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:35.730650 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6955878f8c-qjfhf" Apr 16 14:06:35.951139 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:35.951105 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6955878f8c-qjfhf" Apr 16 14:06:36.000748 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:06:36.000658 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5bdd8569b5-pflnz"] Apr 16 14:07:01.021999 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.021957 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5bdd8569b5-pflnz" podUID="380fbc5f-c4cb-47f2-b320-1c537ea71943" containerName="console" containerID="cri-o://fc712d7fef9e21dfbb97197b8a86c60a6469740bdb95b659881d197d09a704af" gracePeriod=15 Apr 16 14:07:01.266732 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.266708 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5bdd8569b5-pflnz_380fbc5f-c4cb-47f2-b320-1c537ea71943/console/0.log" Apr 16 14:07:01.266868 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.266775 2569 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:07:01.404203 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.404112 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-trusted-ca-bundle\") pod \"380fbc5f-c4cb-47f2-b320-1c537ea71943\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " Apr 16 14:07:01.404203 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.404180 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-oauth-config\") pod \"380fbc5f-c4cb-47f2-b320-1c537ea71943\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " Apr 16 14:07:01.404203 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.404204 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-config\") pod \"380fbc5f-c4cb-47f2-b320-1c537ea71943\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " Apr 16 14:07:01.404505 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.404225 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-serving-cert\") pod \"380fbc5f-c4cb-47f2-b320-1c537ea71943\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " Apr 16 14:07:01.404505 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.404245 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-oauth-serving-cert\") pod \"380fbc5f-c4cb-47f2-b320-1c537ea71943\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " Apr 
16 14:07:01.404505 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.404303 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-service-ca\") pod \"380fbc5f-c4cb-47f2-b320-1c537ea71943\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " Apr 16 14:07:01.404505 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.404354 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kswf\" (UniqueName: \"kubernetes.io/projected/380fbc5f-c4cb-47f2-b320-1c537ea71943-kube-api-access-6kswf\") pod \"380fbc5f-c4cb-47f2-b320-1c537ea71943\" (UID: \"380fbc5f-c4cb-47f2-b320-1c537ea71943\") " Apr 16 14:07:01.404709 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.404678 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "380fbc5f-c4cb-47f2-b320-1c537ea71943" (UID: "380fbc5f-c4cb-47f2-b320-1c537ea71943"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:07:01.404709 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.404688 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-config" (OuterVolumeSpecName: "console-config") pod "380fbc5f-c4cb-47f2-b320-1c537ea71943" (UID: "380fbc5f-c4cb-47f2-b320-1c537ea71943"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:07:01.404793 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.404719 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "380fbc5f-c4cb-47f2-b320-1c537ea71943" (UID: "380fbc5f-c4cb-47f2-b320-1c537ea71943"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:07:01.404793 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.404740 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-service-ca" (OuterVolumeSpecName: "service-ca") pod "380fbc5f-c4cb-47f2-b320-1c537ea71943" (UID: "380fbc5f-c4cb-47f2-b320-1c537ea71943"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:07:01.406511 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.406476 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/380fbc5f-c4cb-47f2-b320-1c537ea71943-kube-api-access-6kswf" (OuterVolumeSpecName: "kube-api-access-6kswf") pod "380fbc5f-c4cb-47f2-b320-1c537ea71943" (UID: "380fbc5f-c4cb-47f2-b320-1c537ea71943"). InnerVolumeSpecName "kube-api-access-6kswf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:07:01.406625 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.406523 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "380fbc5f-c4cb-47f2-b320-1c537ea71943" (UID: "380fbc5f-c4cb-47f2-b320-1c537ea71943"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:07:01.406625 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.406540 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "380fbc5f-c4cb-47f2-b320-1c537ea71943" (UID: "380fbc5f-c4cb-47f2-b320-1c537ea71943"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:07:01.505323 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.505283 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-serving-cert\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:07:01.505323 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.505313 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-oauth-serving-cert\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:07:01.505323 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.505323 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-service-ca\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:07:01.505323 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.505332 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6kswf\" (UniqueName: \"kubernetes.io/projected/380fbc5f-c4cb-47f2-b320-1c537ea71943-kube-api-access-6kswf\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:07:01.505592 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.505345 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-trusted-ca-bundle\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:07:01.505592 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.505354 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-oauth-config\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:07:01.505592 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:01.505363 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/380fbc5f-c4cb-47f2-b320-1c537ea71943-console-config\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:07:02.024115 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:02.024086 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5bdd8569b5-pflnz_380fbc5f-c4cb-47f2-b320-1c537ea71943/console/0.log" Apr 16 14:07:02.024565 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:02.024136 2569 generic.go:358] "Generic (PLEG): container finished" podID="380fbc5f-c4cb-47f2-b320-1c537ea71943" containerID="fc712d7fef9e21dfbb97197b8a86c60a6469740bdb95b659881d197d09a704af" exitCode=2 Apr 16 14:07:02.024565 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:02.024201 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5bdd8569b5-pflnz" Apr 16 14:07:02.024565 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:02.024222 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bdd8569b5-pflnz" event={"ID":"380fbc5f-c4cb-47f2-b320-1c537ea71943","Type":"ContainerDied","Data":"fc712d7fef9e21dfbb97197b8a86c60a6469740bdb95b659881d197d09a704af"} Apr 16 14:07:02.024565 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:02.024282 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bdd8569b5-pflnz" event={"ID":"380fbc5f-c4cb-47f2-b320-1c537ea71943","Type":"ContainerDied","Data":"da617d1a76c6a68d89a2f74d2d2c77e3ab30ecb04cf075e76560ae20b8b170b3"} Apr 16 14:07:02.024565 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:02.024305 2569 scope.go:117] "RemoveContainer" containerID="fc712d7fef9e21dfbb97197b8a86c60a6469740bdb95b659881d197d09a704af" Apr 16 14:07:02.032422 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:02.032402 2569 scope.go:117] "RemoveContainer" containerID="fc712d7fef9e21dfbb97197b8a86c60a6469740bdb95b659881d197d09a704af" Apr 16 14:07:02.032725 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:02.032705 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc712d7fef9e21dfbb97197b8a86c60a6469740bdb95b659881d197d09a704af\": container with ID starting with fc712d7fef9e21dfbb97197b8a86c60a6469740bdb95b659881d197d09a704af not found: ID does not exist" containerID="fc712d7fef9e21dfbb97197b8a86c60a6469740bdb95b659881d197d09a704af" Apr 16 14:07:02.032770 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:02.032736 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc712d7fef9e21dfbb97197b8a86c60a6469740bdb95b659881d197d09a704af"} err="failed to get container status \"fc712d7fef9e21dfbb97197b8a86c60a6469740bdb95b659881d197d09a704af\": rpc error: code = 
NotFound desc = could not find container \"fc712d7fef9e21dfbb97197b8a86c60a6469740bdb95b659881d197d09a704af\": container with ID starting with fc712d7fef9e21dfbb97197b8a86c60a6469740bdb95b659881d197d09a704af not found: ID does not exist" Apr 16 14:07:02.042769 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:02.042740 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5bdd8569b5-pflnz"] Apr 16 14:07:02.047970 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:02.047947 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5bdd8569b5-pflnz"] Apr 16 14:07:03.602436 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:03.602407 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="380fbc5f-c4cb-47f2-b320-1c537ea71943" path="/var/lib/kubelet/pods/380fbc5f-c4cb-47f2-b320-1c537ea71943/volumes" Apr 16 14:07:21.838751 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.838718 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52"] Apr 16 14:07:21.839247 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.839226 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="380fbc5f-c4cb-47f2-b320-1c537ea71943" containerName="console" Apr 16 14:07:21.839339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.839266 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="380fbc5f-c4cb-47f2-b320-1c537ea71943" containerName="console" Apr 16 14:07:21.839400 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.839367 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="380fbc5f-c4cb-47f2-b320-1c537ea71943" containerName="console" Apr 16 14:07:21.842360 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.842339 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" Apr 16 14:07:21.845059 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.845035 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 14:07:21.846141 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.846122 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8zrx2\"" Apr 16 14:07:21.846202 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.846122 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 14:07:21.850238 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.850217 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52"] Apr 16 14:07:21.866084 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.866059 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbc9x\" (UniqueName: \"kubernetes.io/projected/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-kube-api-access-wbc9x\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52\" (UID: \"8d7a75e7-192c-4dce-b473-6ce47a3b1be2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" Apr 16 14:07:21.866214 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.866178 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52\" (UID: \"8d7a75e7-192c-4dce-b473-6ce47a3b1be2\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" Apr 16 14:07:21.866214 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.866208 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52\" (UID: \"8d7a75e7-192c-4dce-b473-6ce47a3b1be2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" Apr 16 14:07:21.967132 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.967086 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbc9x\" (UniqueName: \"kubernetes.io/projected/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-kube-api-access-wbc9x\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52\" (UID: \"8d7a75e7-192c-4dce-b473-6ce47a3b1be2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" Apr 16 14:07:21.967364 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.967186 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52\" (UID: \"8d7a75e7-192c-4dce-b473-6ce47a3b1be2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" Apr 16 14:07:21.967364 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.967215 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52\" (UID: \"8d7a75e7-192c-4dce-b473-6ce47a3b1be2\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" Apr 16 14:07:21.967600 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.967577 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52\" (UID: \"8d7a75e7-192c-4dce-b473-6ce47a3b1be2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" Apr 16 14:07:21.967671 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.967610 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52\" (UID: \"8d7a75e7-192c-4dce-b473-6ce47a3b1be2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" Apr 16 14:07:21.976035 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:21.976015 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbc9x\" (UniqueName: \"kubernetes.io/projected/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-kube-api-access-wbc9x\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52\" (UID: \"8d7a75e7-192c-4dce-b473-6ce47a3b1be2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" Apr 16 14:07:22.152679 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:22.152595 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" Apr 16 14:07:22.274238 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:22.274211 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52"] Apr 16 14:07:22.276760 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:07:22.276730 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d7a75e7_192c_4dce_b473_6ce47a3b1be2.slice/crio-d49537169d4cb940f7ad9f8c421e8f59e689ca72f9781f5db37fe7c9d9d2967e WatchSource:0}: Error finding container d49537169d4cb940f7ad9f8c421e8f59e689ca72f9781f5db37fe7c9d9d2967e: Status 404 returned error can't find the container with id d49537169d4cb940f7ad9f8c421e8f59e689ca72f9781f5db37fe7c9d9d2967e Apr 16 14:07:23.091654 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:23.091610 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" event={"ID":"8d7a75e7-192c-4dce-b473-6ce47a3b1be2","Type":"ContainerStarted","Data":"d49537169d4cb940f7ad9f8c421e8f59e689ca72f9781f5db37fe7c9d9d2967e"} Apr 16 14:07:27.105287 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:27.105229 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" event={"ID":"8d7a75e7-192c-4dce-b473-6ce47a3b1be2","Type":"ContainerStarted","Data":"98b294adee2ecea7f3d7fc01e8a498c77cef4a8323ad7f2b468f4507306cba72"} Apr 16 14:07:28.110302 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:28.110246 2569 generic.go:358] "Generic (PLEG): container finished" podID="8d7a75e7-192c-4dce-b473-6ce47a3b1be2" containerID="98b294adee2ecea7f3d7fc01e8a498c77cef4a8323ad7f2b468f4507306cba72" exitCode=0 Apr 16 14:07:28.110302 ip-10-0-128-60 kubenswrapper[2569]: 
I0416 14:07:28.110287 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" event={"ID":"8d7a75e7-192c-4dce-b473-6ce47a3b1be2","Type":"ContainerDied","Data":"98b294adee2ecea7f3d7fc01e8a498c77cef4a8323ad7f2b468f4507306cba72"}
Apr 16 14:07:30.120394 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:30.120362 2569 generic.go:358] "Generic (PLEG): container finished" podID="8d7a75e7-192c-4dce-b473-6ce47a3b1be2" containerID="d37c384e654ddadc7bbfc1e2dcbd71f0853ff9623914b3ea5d1178e21254283c" exitCode=0
Apr 16 14:07:30.120819 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:30.120437 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" event={"ID":"8d7a75e7-192c-4dce-b473-6ce47a3b1be2","Type":"ContainerDied","Data":"d37c384e654ddadc7bbfc1e2dcbd71f0853ff9623914b3ea5d1178e21254283c"}
Apr 16 14:07:37.144370 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:37.144332 2569 generic.go:358] "Generic (PLEG): container finished" podID="8d7a75e7-192c-4dce-b473-6ce47a3b1be2" containerID="4c8115dec864830e6f39d1d75bfc628050f45fb9628fa0a7f28b6c836e5bcdd6" exitCode=0
Apr 16 14:07:37.144798 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:37.144396 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" event={"ID":"8d7a75e7-192c-4dce-b473-6ce47a3b1be2","Type":"ContainerDied","Data":"4c8115dec864830e6f39d1d75bfc628050f45fb9628fa0a7f28b6c836e5bcdd6"}
Apr 16 14:07:38.265726 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:38.265699 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52"
Apr 16 14:07:38.311211 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:38.311183 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-util\") pod \"8d7a75e7-192c-4dce-b473-6ce47a3b1be2\" (UID: \"8d7a75e7-192c-4dce-b473-6ce47a3b1be2\") "
Apr 16 14:07:38.311399 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:38.311291 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbc9x\" (UniqueName: \"kubernetes.io/projected/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-kube-api-access-wbc9x\") pod \"8d7a75e7-192c-4dce-b473-6ce47a3b1be2\" (UID: \"8d7a75e7-192c-4dce-b473-6ce47a3b1be2\") "
Apr 16 14:07:38.311399 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:38.311308 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-bundle\") pod \"8d7a75e7-192c-4dce-b473-6ce47a3b1be2\" (UID: \"8d7a75e7-192c-4dce-b473-6ce47a3b1be2\") "
Apr 16 14:07:38.311912 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:38.311884 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-bundle" (OuterVolumeSpecName: "bundle") pod "8d7a75e7-192c-4dce-b473-6ce47a3b1be2" (UID: "8d7a75e7-192c-4dce-b473-6ce47a3b1be2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:07:38.313499 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:38.313474 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-kube-api-access-wbc9x" (OuterVolumeSpecName: "kube-api-access-wbc9x") pod "8d7a75e7-192c-4dce-b473-6ce47a3b1be2" (UID: "8d7a75e7-192c-4dce-b473-6ce47a3b1be2"). InnerVolumeSpecName "kube-api-access-wbc9x". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:07:38.315344 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:38.315321 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-util" (OuterVolumeSpecName: "util") pod "8d7a75e7-192c-4dce-b473-6ce47a3b1be2" (UID: "8d7a75e7-192c-4dce-b473-6ce47a3b1be2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:07:38.412153 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:38.412117 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wbc9x\" (UniqueName: \"kubernetes.io/projected/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-kube-api-access-wbc9x\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:07:38.412153 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:38.412153 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-bundle\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:07:38.412376 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:38.412167 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d7a75e7-192c-4dce-b473-6ce47a3b1be2-util\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:07:39.152300 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:39.152266 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52"
Apr 16 14:07:39.152300 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:39.152285 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjfq52" event={"ID":"8d7a75e7-192c-4dce-b473-6ce47a3b1be2","Type":"ContainerDied","Data":"d49537169d4cb940f7ad9f8c421e8f59e689ca72f9781f5db37fe7c9d9d2967e"}
Apr 16 14:07:39.152497 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:39.152317 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d49537169d4cb940f7ad9f8c421e8f59e689ca72f9781f5db37fe7c9d9d2967e"
Apr 16 14:07:43.506378 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.506337 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn"]
Apr 16 14:07:43.506871 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.506767 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d7a75e7-192c-4dce-b473-6ce47a3b1be2" containerName="pull"
Apr 16 14:07:43.506871 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.506785 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d7a75e7-192c-4dce-b473-6ce47a3b1be2" containerName="pull"
Apr 16 14:07:43.506871 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.506804 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d7a75e7-192c-4dce-b473-6ce47a3b1be2" containerName="util"
Apr 16 14:07:43.506871 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.506812 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d7a75e7-192c-4dce-b473-6ce47a3b1be2" containerName="util"
Apr 16 14:07:43.506871 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.506822 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d7a75e7-192c-4dce-b473-6ce47a3b1be2" containerName="extract"
Apr 16 14:07:43.506871 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.506829 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d7a75e7-192c-4dce-b473-6ce47a3b1be2" containerName="extract"
Apr 16 14:07:43.507171 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.506925 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d7a75e7-192c-4dce-b473-6ce47a3b1be2" containerName="extract"
Apr 16 14:07:43.560529 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.560485 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn"]
Apr 16 14:07:43.560711 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.560633 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn"
Apr 16 14:07:43.565778 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.565753 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-j9ggk\""
Apr 16 14:07:43.565778 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.565769 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 16 14:07:43.565979 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.565776 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 16 14:07:43.566239 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.566217 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 16 14:07:43.657269 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.657214 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0b1a0f87-c085-4bf8-a29b-f798f5cb2528-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn\" (UID: \"0b1a0f87-c085-4bf8-a29b-f798f5cb2528\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn"
Apr 16 14:07:43.657433 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.657315 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw2m8\" (UniqueName: \"kubernetes.io/projected/0b1a0f87-c085-4bf8-a29b-f798f5cb2528-kube-api-access-sw2m8\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn\" (UID: \"0b1a0f87-c085-4bf8-a29b-f798f5cb2528\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn"
Apr 16 14:07:43.758609 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.758523 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sw2m8\" (UniqueName: \"kubernetes.io/projected/0b1a0f87-c085-4bf8-a29b-f798f5cb2528-kube-api-access-sw2m8\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn\" (UID: \"0b1a0f87-c085-4bf8-a29b-f798f5cb2528\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn"
Apr 16 14:07:43.758609 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.758607 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0b1a0f87-c085-4bf8-a29b-f798f5cb2528-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn\" (UID: \"0b1a0f87-c085-4bf8-a29b-f798f5cb2528\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn"
Apr 16 14:07:43.761064 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.761030 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0b1a0f87-c085-4bf8-a29b-f798f5cb2528-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn\" (UID: \"0b1a0f87-c085-4bf8-a29b-f798f5cb2528\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn"
Apr 16 14:07:43.768373 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.768349 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw2m8\" (UniqueName: \"kubernetes.io/projected/0b1a0f87-c085-4bf8-a29b-f798f5cb2528-kube-api-access-sw2m8\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn\" (UID: \"0b1a0f87-c085-4bf8-a29b-f798f5cb2528\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn"
Apr 16 14:07:43.870885 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.870844 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn"
Apr 16 14:07:43.996141 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:43.996012 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn"]
Apr 16 14:07:43.998797 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:07:43.998764 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b1a0f87_c085_4bf8_a29b_f798f5cb2528.slice/crio-d40a15f006322c1b489e5a27d2be946de73d35934970cdd4c4684b511f3ef3f8 WatchSource:0}: Error finding container d40a15f006322c1b489e5a27d2be946de73d35934970cdd4c4684b511f3ef3f8: Status 404 returned error can't find the container with id d40a15f006322c1b489e5a27d2be946de73d35934970cdd4c4684b511f3ef3f8
Apr 16 14:07:44.168364 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:44.168328 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn" event={"ID":"0b1a0f87-c085-4bf8-a29b-f798f5cb2528","Type":"ContainerStarted","Data":"d40a15f006322c1b489e5a27d2be946de73d35934970cdd4c4684b511f3ef3f8"}
Apr 16 14:07:54.065135 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.065099 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2gcnj"]
Apr 16 14:07:54.068446 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.068426 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:07:54.071199 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.071179 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 16 14:07:54.071331 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.071234 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 16 14:07:54.071384 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.071358 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-m5k4v\""
Apr 16 14:07:54.076927 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.076903 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2gcnj"]
Apr 16 14:07:54.147317 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.147284 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-certificates\") pod \"keda-operator-ffbb595cb-2gcnj\" (UID: \"cbc61b38-d9d4-43c3-b74d-dd800ec25787\") " pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:07:54.147518 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.147328 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/cbc61b38-d9d4-43c3-b74d-dd800ec25787-cabundle0\") pod \"keda-operator-ffbb595cb-2gcnj\" (UID: \"cbc61b38-d9d4-43c3-b74d-dd800ec25787\") " pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:07:54.147518 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.147413 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn48p\" (UniqueName: \"kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-kube-api-access-mn48p\") pod \"keda-operator-ffbb595cb-2gcnj\" (UID: \"cbc61b38-d9d4-43c3-b74d-dd800ec25787\") " pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:07:54.202319 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.202282 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn" event={"ID":"0b1a0f87-c085-4bf8-a29b-f798f5cb2528","Type":"ContainerStarted","Data":"3d718bf6aa268145ab7ab1a631e6ca367002e0fba57277eae24454c705789efd"}
Apr 16 14:07:54.202505 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.202390 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn"
Apr 16 14:07:54.220928 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.220876 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn" podStartSLOduration=1.645350254 podStartE2EDuration="11.220858358s" podCreationTimestamp="2026-04-16 14:07:43 +0000 UTC" firstStartedPulling="2026-04-16 14:07:44.000500273 +0000 UTC m=+499.008736518" lastFinishedPulling="2026-04-16 14:07:53.576008387 +0000 UTC m=+508.584244622" observedRunningTime="2026-04-16 14:07:54.219284037 +0000 UTC m=+509.227520292" watchObservedRunningTime="2026-04-16 14:07:54.220858358 +0000 UTC m=+509.229094613"
Apr 16 14:07:54.248067 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.248036 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-certificates\") pod \"keda-operator-ffbb595cb-2gcnj\" (UID: \"cbc61b38-d9d4-43c3-b74d-dd800ec25787\") " pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:07:54.248169 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.248080 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/cbc61b38-d9d4-43c3-b74d-dd800ec25787-cabundle0\") pod \"keda-operator-ffbb595cb-2gcnj\" (UID: \"cbc61b38-d9d4-43c3-b74d-dd800ec25787\") " pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:07:54.248169 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.248104 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mn48p\" (UniqueName: \"kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-kube-api-access-mn48p\") pod \"keda-operator-ffbb595cb-2gcnj\" (UID: \"cbc61b38-d9d4-43c3-b74d-dd800ec25787\") " pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:07:54.248237 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:54.248185 2569 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 16 14:07:54.248237 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:54.248210 2569 secret.go:281] references non-existent secret key: ca.crt
Apr 16 14:07:54.248237 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:54.248217 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 14:07:54.248237 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:54.248229 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2gcnj: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 16 14:07:54.248405 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:54.248310 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-certificates podName:cbc61b38-d9d4-43c3-b74d-dd800ec25787 nodeName:}" failed. No retries permitted until 2026-04-16 14:07:54.748293706 +0000 UTC m=+509.756529953 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-certificates") pod "keda-operator-ffbb595cb-2gcnj" (UID: "cbc61b38-d9d4-43c3-b74d-dd800ec25787") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 16 14:07:54.248755 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.248737 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/cbc61b38-d9d4-43c3-b74d-dd800ec25787-cabundle0\") pod \"keda-operator-ffbb595cb-2gcnj\" (UID: \"cbc61b38-d9d4-43c3-b74d-dd800ec25787\") " pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:07:54.256833 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.256801 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn48p\" (UniqueName: \"kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-kube-api-access-mn48p\") pod \"keda-operator-ffbb595cb-2gcnj\" (UID: \"cbc61b38-d9d4-43c3-b74d-dd800ec25787\") " pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:07:54.724752 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.724718 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-r45q5"]
Apr 16 14:07:54.728189 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.728170 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-r45q5"
Apr 16 14:07:54.730611 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.730594 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 16 14:07:54.738791 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.738769 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-r45q5"]
Apr 16 14:07:54.752122 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.752090 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-certificates\") pod \"keda-operator-ffbb595cb-2gcnj\" (UID: \"cbc61b38-d9d4-43c3-b74d-dd800ec25787\") " pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:07:54.752419 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:54.752294 2569 secret.go:281] references non-existent secret key: ca.crt
Apr 16 14:07:54.752419 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:54.752319 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 14:07:54.752419 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:54.752333 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2gcnj: references non-existent secret key: ca.crt
Apr 16 14:07:54.752631 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:54.752429 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-certificates podName:cbc61b38-d9d4-43c3-b74d-dd800ec25787 nodeName:}" failed. No retries permitted until 2026-04-16 14:07:55.752405027 +0000 UTC m=+510.760641291 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-certificates") pod "keda-operator-ffbb595cb-2gcnj" (UID: "cbc61b38-d9d4-43c3-b74d-dd800ec25787") : references non-existent secret key: ca.crt
Apr 16 14:07:54.853234 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.853194 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0b40693d-e854-4987-aa2f-c507d256a92b-certificates\") pod \"keda-admission-cf49989db-r45q5\" (UID: \"0b40693d-e854-4987-aa2f-c507d256a92b\") " pod="openshift-keda/keda-admission-cf49989db-r45q5"
Apr 16 14:07:54.853454 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.853242 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7mnq\" (UniqueName: \"kubernetes.io/projected/0b40693d-e854-4987-aa2f-c507d256a92b-kube-api-access-r7mnq\") pod \"keda-admission-cf49989db-r45q5\" (UID: \"0b40693d-e854-4987-aa2f-c507d256a92b\") " pod="openshift-keda/keda-admission-cf49989db-r45q5"
Apr 16 14:07:54.954106 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.954072 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0b40693d-e854-4987-aa2f-c507d256a92b-certificates\") pod \"keda-admission-cf49989db-r45q5\" (UID: \"0b40693d-e854-4987-aa2f-c507d256a92b\") " pod="openshift-keda/keda-admission-cf49989db-r45q5"
Apr 16 14:07:54.954106 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.954111 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7mnq\" (UniqueName: \"kubernetes.io/projected/0b40693d-e854-4987-aa2f-c507d256a92b-kube-api-access-r7mnq\") pod \"keda-admission-cf49989db-r45q5\" (UID: \"0b40693d-e854-4987-aa2f-c507d256a92b\") " pod="openshift-keda/keda-admission-cf49989db-r45q5"
Apr 16 14:07:54.956632 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.956603 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0b40693d-e854-4987-aa2f-c507d256a92b-certificates\") pod \"keda-admission-cf49989db-r45q5\" (UID: \"0b40693d-e854-4987-aa2f-c507d256a92b\") " pod="openshift-keda/keda-admission-cf49989db-r45q5"
Apr 16 14:07:54.963378 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:54.963352 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7mnq\" (UniqueName: \"kubernetes.io/projected/0b40693d-e854-4987-aa2f-c507d256a92b-kube-api-access-r7mnq\") pod \"keda-admission-cf49989db-r45q5\" (UID: \"0b40693d-e854-4987-aa2f-c507d256a92b\") " pod="openshift-keda/keda-admission-cf49989db-r45q5"
Apr 16 14:07:55.038083 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:55.037989 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-r45q5"
Apr 16 14:07:55.161832 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:55.161681 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-r45q5"]
Apr 16 14:07:55.164073 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:07:55.164045 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b40693d_e854_4987_aa2f_c507d256a92b.slice/crio-586494e19565624a058b94052956fc65a5b4090e85bd68d65af8d66856612a54 WatchSource:0}: Error finding container 586494e19565624a058b94052956fc65a5b4090e85bd68d65af8d66856612a54: Status 404 returned error can't find the container with id 586494e19565624a058b94052956fc65a5b4090e85bd68d65af8d66856612a54
Apr 16 14:07:55.207340 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:55.207299 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-r45q5" event={"ID":"0b40693d-e854-4987-aa2f-c507d256a92b","Type":"ContainerStarted","Data":"586494e19565624a058b94052956fc65a5b4090e85bd68d65af8d66856612a54"}
Apr 16 14:07:55.762088 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:55.762049 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-certificates\") pod \"keda-operator-ffbb595cb-2gcnj\" (UID: \"cbc61b38-d9d4-43c3-b74d-dd800ec25787\") " pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:07:55.762311 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:55.762195 2569 secret.go:281] references non-existent secret key: ca.crt
Apr 16 14:07:55.762311 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:55.762214 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 14:07:55.762311 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:55.762225 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2gcnj: references non-existent secret key: ca.crt
Apr 16 14:07:55.762311 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:55.762294 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-certificates podName:cbc61b38-d9d4-43c3-b74d-dd800ec25787 nodeName:}" failed. No retries permitted until 2026-04-16 14:07:57.762277854 +0000 UTC m=+512.770514087 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-certificates") pod "keda-operator-ffbb595cb-2gcnj" (UID: "cbc61b38-d9d4-43c3-b74d-dd800ec25787") : references non-existent secret key: ca.crt
Apr 16 14:07:57.779384 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:57.779339 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-certificates\") pod \"keda-operator-ffbb595cb-2gcnj\" (UID: \"cbc61b38-d9d4-43c3-b74d-dd800ec25787\") " pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:07:57.779770 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:57.779462 2569 secret.go:281] references non-existent secret key: ca.crt
Apr 16 14:07:57.779770 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:57.779473 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 14:07:57.779770 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:57.779482 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2gcnj: references non-existent secret key: ca.crt
Apr 16 14:07:57.779770 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:07:57.779530 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-certificates podName:cbc61b38-d9d4-43c3-b74d-dd800ec25787 nodeName:}" failed. No retries permitted until 2026-04-16 14:08:01.779516809 +0000 UTC m=+516.787753043 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-certificates") pod "keda-operator-ffbb595cb-2gcnj" (UID: "cbc61b38-d9d4-43c3-b74d-dd800ec25787") : references non-existent secret key: ca.crt
Apr 16 14:07:58.218264 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:58.218218 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-r45q5" event={"ID":"0b40693d-e854-4987-aa2f-c507d256a92b","Type":"ContainerStarted","Data":"3a0bb8ea8ab20e987094e428548c6d34135efa6926e7b286b1ceea17f0fa7df7"}
Apr 16 14:07:58.218428 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:58.218391 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-r45q5"
Apr 16 14:07:58.236885 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:07:58.236835 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-r45q5" podStartSLOduration=2.2015058180000002 podStartE2EDuration="4.236819396s" podCreationTimestamp="2026-04-16 14:07:54 +0000 UTC" firstStartedPulling="2026-04-16 14:07:55.165465297 +0000 UTC m=+510.173701534" lastFinishedPulling="2026-04-16 14:07:57.200778872 +0000 UTC m=+512.209015112" observedRunningTime="2026-04-16 14:07:58.234675025 +0000 UTC m=+513.242911281" watchObservedRunningTime="2026-04-16 14:07:58.236819396 +0000 UTC m=+513.245055651"
Apr 16 14:08:01.814426 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:01.814397 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-certificates\") pod \"keda-operator-ffbb595cb-2gcnj\" (UID: \"cbc61b38-d9d4-43c3-b74d-dd800ec25787\") " pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:08:01.816811 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:01.816790 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cbc61b38-d9d4-43c3-b74d-dd800ec25787-certificates\") pod \"keda-operator-ffbb595cb-2gcnj\" (UID: \"cbc61b38-d9d4-43c3-b74d-dd800ec25787\") " pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:08:01.879318 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:01.879230 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:08:02.000464 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:02.000431 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2gcnj"]
Apr 16 14:08:02.003235 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:08:02.003206 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbc61b38_d9d4_43c3_b74d_dd800ec25787.slice/crio-b3c13d9770eba77b4ed17c72a9f09628ec5b71dbc3f7d1db96946aafdd9a4906 WatchSource:0}: Error finding container b3c13d9770eba77b4ed17c72a9f09628ec5b71dbc3f7d1db96946aafdd9a4906: Status 404 returned error can't find the container with id b3c13d9770eba77b4ed17c72a9f09628ec5b71dbc3f7d1db96946aafdd9a4906
Apr 16 14:08:02.232124 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:02.232090 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-2gcnj" event={"ID":"cbc61b38-d9d4-43c3-b74d-dd800ec25787","Type":"ContainerStarted","Data":"b3c13d9770eba77b4ed17c72a9f09628ec5b71dbc3f7d1db96946aafdd9a4906"}
Apr 16 14:08:06.247264 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:06.247220 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-2gcnj" event={"ID":"cbc61b38-d9d4-43c3-b74d-dd800ec25787","Type":"ContainerStarted","Data":"424012581d59187b64e8988bee5b4fbf39348a4f90026f16c2f244154316d6b4"}
Apr 16 14:08:06.247674 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:06.247408 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:08:06.264139 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:06.264096 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-2gcnj" podStartSLOduration=8.833861773 podStartE2EDuration="12.264083344s" podCreationTimestamp="2026-04-16 14:07:54 +0000 UTC" firstStartedPulling="2026-04-16 14:08:02.004591705 +0000 UTC m=+517.012827938" lastFinishedPulling="2026-04-16 14:08:05.434813277 +0000 UTC m=+520.443049509" observedRunningTime="2026-04-16 14:08:06.263741488 +0000 UTC m=+521.271977744" watchObservedRunningTime="2026-04-16 14:08:06.264083344 +0000 UTC m=+521.272319599"
Apr 16 14:08:15.210163 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:15.210082 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hp7gn"
Apr 16 14:08:19.225400 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:19.225373 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-r45q5"
Apr 16 14:08:27.253749 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:27.253718 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-2gcnj"
Apr 16 14:08:47.096332 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:47.096288 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x"]
Apr 16 14:08:47.100634 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:47.100605 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x"
Apr 16 14:08:47.104019 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:47.103980 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 14:08:47.104163 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:47.104050 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 14:08:47.105135 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:47.105114 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8zrx2\""
Apr 16 14:08:47.146123 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:47.146092 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x"]
Apr 16 14:08:47.185807 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:47.185776 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x\" (UID: \"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x"
Apr 16 14:08:47.185971 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:47.185817 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x\" (UID: \"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x"
Apr 16 14:08:47.185971 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:47.185928 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trg4p\" (UniqueName: \"kubernetes.io/projected/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-kube-api-access-trg4p\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x\" (UID: \"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x"
Apr 16 14:08:47.287339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:47.287294 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trg4p\" (UniqueName: \"kubernetes.io/projected/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-kube-api-access-trg4p\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x\" (UID: \"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x"
Apr 16 14:08:47.287563 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:47.287374 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x\" (UID: \"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x"
Apr 16 14:08:47.287563 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:47.287398 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x\" (UID: \"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x"
Apr 16 14:08:47.287761 ip-10-0-128-60
kubenswrapper[2569]: I0416 14:08:47.287743 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x\" (UID: \"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x" Apr 16 14:08:47.287818 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:47.287786 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x\" (UID: \"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x" Apr 16 14:08:47.296713 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:47.296684 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trg4p\" (UniqueName: \"kubernetes.io/projected/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-kube-api-access-trg4p\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x\" (UID: \"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x" Apr 16 14:08:47.411124 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:47.411095 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x" Apr 16 14:08:47.546887 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:47.546843 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x"] Apr 16 14:08:47.549842 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:08:47.549815 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bb5a034_7bf6_4023_a8fb_cedc2f1521d6.slice/crio-f139c80f0a1722947c130b17b0b421e25295d15dde4614bdc946c18b5b388a74 WatchSource:0}: Error finding container f139c80f0a1722947c130b17b0b421e25295d15dde4614bdc946c18b5b388a74: Status 404 returned error can't find the container with id f139c80f0a1722947c130b17b0b421e25295d15dde4614bdc946c18b5b388a74 Apr 16 14:08:48.382353 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:48.382309 2569 generic.go:358] "Generic (PLEG): container finished" podID="0bb5a034-7bf6-4023-a8fb-cedc2f1521d6" containerID="effc2c4c1231619c54af87e2a7c6ea9e9777881add9cdfc08b300e77e0a07fe9" exitCode=0 Apr 16 14:08:48.382735 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:48.382396 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x" event={"ID":"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6","Type":"ContainerDied","Data":"effc2c4c1231619c54af87e2a7c6ea9e9777881add9cdfc08b300e77e0a07fe9"} Apr 16 14:08:48.382735 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:48.382432 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x" event={"ID":"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6","Type":"ContainerStarted","Data":"f139c80f0a1722947c130b17b0b421e25295d15dde4614bdc946c18b5b388a74"} Apr 16 14:08:49.387489 ip-10-0-128-60 kubenswrapper[2569]: I0416 
14:08:49.387452 2569 generic.go:358] "Generic (PLEG): container finished" podID="0bb5a034-7bf6-4023-a8fb-cedc2f1521d6" containerID="d67c7e43023e0ef95a6ea75bb7781060814d31b462d9abc73aec06744f59b477" exitCode=0 Apr 16 14:08:49.387849 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:49.387530 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x" event={"ID":"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6","Type":"ContainerDied","Data":"d67c7e43023e0ef95a6ea75bb7781060814d31b462d9abc73aec06744f59b477"} Apr 16 14:08:50.393074 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:50.393046 2569 generic.go:358] "Generic (PLEG): container finished" podID="0bb5a034-7bf6-4023-a8fb-cedc2f1521d6" containerID="fbf95557c0dd7877efb0c0eee88d5233d24b388e79e2529948d6d52d9f273648" exitCode=0 Apr 16 14:08:50.393485 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:50.393141 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x" event={"ID":"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6","Type":"ContainerDied","Data":"fbf95557c0dd7877efb0c0eee88d5233d24b388e79e2529948d6d52d9f273648"} Apr 16 14:08:51.526462 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:51.526441 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x" Apr 16 14:08:51.624470 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:51.624445 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-util\") pod \"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6\" (UID: \"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6\") " Apr 16 14:08:51.624639 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:51.624487 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trg4p\" (UniqueName: \"kubernetes.io/projected/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-kube-api-access-trg4p\") pod \"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6\" (UID: \"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6\") " Apr 16 14:08:51.624639 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:51.624537 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-bundle\") pod \"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6\" (UID: \"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6\") " Apr 16 14:08:51.625181 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:51.625149 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-bundle" (OuterVolumeSpecName: "bundle") pod "0bb5a034-7bf6-4023-a8fb-cedc2f1521d6" (UID: "0bb5a034-7bf6-4023-a8fb-cedc2f1521d6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:08:51.626713 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:51.626689 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-kube-api-access-trg4p" (OuterVolumeSpecName: "kube-api-access-trg4p") pod "0bb5a034-7bf6-4023-a8fb-cedc2f1521d6" (UID: "0bb5a034-7bf6-4023-a8fb-cedc2f1521d6"). InnerVolumeSpecName "kube-api-access-trg4p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:08:51.629691 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:51.629656 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-util" (OuterVolumeSpecName: "util") pod "0bb5a034-7bf6-4023-a8fb-cedc2f1521d6" (UID: "0bb5a034-7bf6-4023-a8fb-cedc2f1521d6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:08:51.725267 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:51.725229 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-bundle\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:08:51.725442 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:51.725278 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-util\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:08:51.725442 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:51.725287 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trg4p\" (UniqueName: \"kubernetes.io/projected/0bb5a034-7bf6-4023-a8fb-cedc2f1521d6-kube-api-access-trg4p\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:08:52.400888 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:52.400855 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x" event={"ID":"0bb5a034-7bf6-4023-a8fb-cedc2f1521d6","Type":"ContainerDied","Data":"f139c80f0a1722947c130b17b0b421e25295d15dde4614bdc946c18b5b388a74"} Apr 16 14:08:52.400888 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:52.400890 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f139c80f0a1722947c130b17b0b421e25295d15dde4614bdc946c18b5b388a74" Apr 16 14:08:52.400888 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:08:52.400891 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fm82x" Apr 16 14:09:08.787046 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.787009 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97"] Apr 16 14:09:08.787575 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.787555 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bb5a034-7bf6-4023-a8fb-cedc2f1521d6" containerName="extract" Apr 16 14:09:08.787649 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.787579 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bb5a034-7bf6-4023-a8fb-cedc2f1521d6" containerName="extract" Apr 16 14:09:08.787649 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.787598 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bb5a034-7bf6-4023-a8fb-cedc2f1521d6" containerName="pull" Apr 16 14:09:08.787649 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.787607 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bb5a034-7bf6-4023-a8fb-cedc2f1521d6" containerName="pull" Apr 16 14:09:08.787649 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.787633 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="0bb5a034-7bf6-4023-a8fb-cedc2f1521d6" containerName="util" Apr 16 14:09:08.787649 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.787642 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bb5a034-7bf6-4023-a8fb-cedc2f1521d6" containerName="util" Apr 16 14:09:08.787917 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.787723 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0bb5a034-7bf6-4023-a8fb-cedc2f1521d6" containerName="extract" Apr 16 14:09:08.790920 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.790899 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" Apr 16 14:09:08.793483 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.793460 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 14:09:08.794635 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.794619 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8zrx2\"" Apr 16 14:09:08.794686 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.794651 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 14:09:08.797955 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.797928 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97"] Apr 16 14:09:08.870653 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.870609 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97\" (UID: \"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" Apr 16 14:09:08.870848 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.870670 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgc88\" (UniqueName: \"kubernetes.io/projected/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-kube-api-access-qgc88\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97\" (UID: \"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" Apr 16 14:09:08.870848 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.870723 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97\" (UID: \"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" Apr 16 14:09:08.971547 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.971516 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgc88\" (UniqueName: \"kubernetes.io/projected/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-kube-api-access-qgc88\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97\" (UID: \"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" Apr 16 14:09:08.971678 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.971581 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97\" (UID: \"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" Apr 16 14:09:08.971678 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.971651 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97\" (UID: \"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" Apr 16 14:09:08.971972 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.971951 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97\" (UID: \"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" Apr 16 14:09:08.972013 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.971993 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97\" (UID: \"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" Apr 16 14:09:08.979825 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:08.979798 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgc88\" (UniqueName: \"kubernetes.io/projected/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-kube-api-access-qgc88\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97\" (UID: \"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" Apr 16 
14:09:09.100781 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:09.100685 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" Apr 16 14:09:09.224879 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:09.224817 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97"] Apr 16 14:09:09.227084 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:09:09.227047 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1812a4fc_b9c0_4a73_9c29_0d7f7e0ed2cd.slice/crio-64cae66bc77235de1735c7378a625f45e43a8d93cf4b7fb10a8a8074721c2c78 WatchSource:0}: Error finding container 64cae66bc77235de1735c7378a625f45e43a8d93cf4b7fb10a8a8074721c2c78: Status 404 returned error can't find the container with id 64cae66bc77235de1735c7378a625f45e43a8d93cf4b7fb10a8a8074721c2c78 Apr 16 14:09:09.461932 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:09.461896 2569 generic.go:358] "Generic (PLEG): container finished" podID="1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd" containerID="316dfeeba70adc9c8a886168f5d21f25fd79ce94839dbea18145c50e5122d135" exitCode=0 Apr 16 14:09:09.462091 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:09.461954 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" event={"ID":"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd","Type":"ContainerDied","Data":"316dfeeba70adc9c8a886168f5d21f25fd79ce94839dbea18145c50e5122d135"} Apr 16 14:09:09.462091 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:09.461979 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" 
event={"ID":"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd","Type":"ContainerStarted","Data":"64cae66bc77235de1735c7378a625f45e43a8d93cf4b7fb10a8a8074721c2c78"} Apr 16 14:09:12.473736 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:12.473703 2569 generic.go:358] "Generic (PLEG): container finished" podID="1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd" containerID="f1c0c3de7dba6818a570702b2c6ae855dd70ca4e828b349cb9aa1918a79b13cd" exitCode=0 Apr 16 14:09:12.474106 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:12.473795 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" event={"ID":"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd","Type":"ContainerDied","Data":"f1c0c3de7dba6818a570702b2c6ae855dd70ca4e828b349cb9aa1918a79b13cd"} Apr 16 14:09:13.478434 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:13.478398 2569 generic.go:358] "Generic (PLEG): container finished" podID="1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd" containerID="ef2a6a17a6a5631dfefbbab48eaa6732c7fb41a16fdccdb809c9e18c143100f2" exitCode=0 Apr 16 14:09:13.478872 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:13.478458 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" event={"ID":"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd","Type":"ContainerDied","Data":"ef2a6a17a6a5631dfefbbab48eaa6732c7fb41a16fdccdb809c9e18c143100f2"} Apr 16 14:09:14.609305 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:14.609280 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" Apr 16 14:09:14.724323 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:14.724240 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgc88\" (UniqueName: \"kubernetes.io/projected/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-kube-api-access-qgc88\") pod \"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd\" (UID: \"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd\") " Apr 16 14:09:14.724323 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:14.724331 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-bundle\") pod \"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd\" (UID: \"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd\") " Apr 16 14:09:14.724523 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:14.724431 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-util\") pod \"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd\" (UID: \"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd\") " Apr 16 14:09:14.724777 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:14.724752 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-bundle" (OuterVolumeSpecName: "bundle") pod "1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd" (UID: "1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:09:14.726502 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:14.726468 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-kube-api-access-qgc88" (OuterVolumeSpecName: "kube-api-access-qgc88") pod "1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd" (UID: "1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd"). InnerVolumeSpecName "kube-api-access-qgc88". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:09:14.729389 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:14.729352 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-util" (OuterVolumeSpecName: "util") pod "1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd" (UID: "1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:09:14.825932 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:14.825838 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-util\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:09:14.825932 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:14.825868 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qgc88\" (UniqueName: \"kubernetes.io/projected/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-kube-api-access-qgc88\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:09:14.825932 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:14.825879 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd-bundle\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:09:15.486241 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:15.486209 2569 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" Apr 16 14:09:15.486428 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:15.486205 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fdqm97" event={"ID":"1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd","Type":"ContainerDied","Data":"64cae66bc77235de1735c7378a625f45e43a8d93cf4b7fb10a8a8074721c2c78"} Apr 16 14:09:15.486428 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:15.486286 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64cae66bc77235de1735c7378a625f45e43a8d93cf4b7fb10a8a8074721c2c78" Apr 16 14:09:25.501713 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:25.501682 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/2.log" Apr 16 14:09:25.502233 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:25.501910 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/2.log" Apr 16 14:09:39.216437 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.216394 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc"] Apr 16 14:09:39.216818 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.216714 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd" containerName="util" Apr 16 14:09:39.216818 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.216724 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd" containerName="util" Apr 16 14:09:39.216818 ip-10-0-128-60 kubenswrapper[2569]: I0416 
14:09:39.216743 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd" containerName="pull"
Apr 16 14:09:39.216818 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.216749 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd" containerName="pull"
Apr 16 14:09:39.216818 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.216763 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd" containerName="extract"
Apr 16 14:09:39.216818 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.216768 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd" containerName="extract"
Apr 16 14:09:39.216818 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.216816 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1812a4fc-b9c0-4a73-9c29-0d7f7e0ed2cd" containerName="extract"
Apr 16 14:09:39.224040 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.224016 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc"
Apr 16 14:09:39.226740 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.226706 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 14:09:39.227026 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.227005 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc"]
Apr 16 14:09:39.227952 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.227922 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 14:09:39.228069 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.228010 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8zrx2\""
Apr 16 14:09:39.335247 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.335209 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc\" (UID: \"ed83c97b-6b86-4702-9cb4-5b094a1f4e40\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc"
Apr 16 14:09:39.335247 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.335269 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc\" (UID: \"ed83c97b-6b86-4702-9cb4-5b094a1f4e40\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc"
Apr 16 14:09:39.335247 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.335290 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz7bb\" (UniqueName: \"kubernetes.io/projected/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-kube-api-access-tz7bb\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc\" (UID: \"ed83c97b-6b86-4702-9cb4-5b094a1f4e40\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc"
Apr 16 14:09:39.436691 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.436655 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc\" (UID: \"ed83c97b-6b86-4702-9cb4-5b094a1f4e40\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc"
Apr 16 14:09:39.436871 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.436698 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc\" (UID: \"ed83c97b-6b86-4702-9cb4-5b094a1f4e40\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc"
Apr 16 14:09:39.436871 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.436720 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tz7bb\" (UniqueName: \"kubernetes.io/projected/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-kube-api-access-tz7bb\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc\" (UID: \"ed83c97b-6b86-4702-9cb4-5b094a1f4e40\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc"
Apr 16 14:09:39.437129 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.437106 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc\" (UID: \"ed83c97b-6b86-4702-9cb4-5b094a1f4e40\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc"
Apr 16 14:09:39.437165 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.437112 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc\" (UID: \"ed83c97b-6b86-4702-9cb4-5b094a1f4e40\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc"
Apr 16 14:09:39.445038 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.445016 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz7bb\" (UniqueName: \"kubernetes.io/projected/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-kube-api-access-tz7bb\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc\" (UID: \"ed83c97b-6b86-4702-9cb4-5b094a1f4e40\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc"
Apr 16 14:09:39.534685 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.534594 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc"
Apr 16 14:09:39.654733 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:39.654711 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc"]
Apr 16 14:09:39.656898 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:09:39.656870 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded83c97b_6b86_4702_9cb4_5b094a1f4e40.slice/crio-6b31166971bf608ff94bc105c00b9d38865166ec276a8205c1436a81a37bd305 WatchSource:0}: Error finding container 6b31166971bf608ff94bc105c00b9d38865166ec276a8205c1436a81a37bd305: Status 404 returned error can't find the container with id 6b31166971bf608ff94bc105c00b9d38865166ec276a8205c1436a81a37bd305
Apr 16 14:09:40.568220 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:40.568188 2569 generic.go:358] "Generic (PLEG): container finished" podID="ed83c97b-6b86-4702-9cb4-5b094a1f4e40" containerID="afc0b1908aa4866f178b5319df6a1a53c847a4d75da789602713eac396bdb821" exitCode=0
Apr 16 14:09:40.568220 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:40.568225 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc" event={"ID":"ed83c97b-6b86-4702-9cb4-5b094a1f4e40","Type":"ContainerDied","Data":"afc0b1908aa4866f178b5319df6a1a53c847a4d75da789602713eac396bdb821"}
Apr 16 14:09:40.568654 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:40.568265 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc" event={"ID":"ed83c97b-6b86-4702-9cb4-5b094a1f4e40","Type":"ContainerStarted","Data":"6b31166971bf608ff94bc105c00b9d38865166ec276a8205c1436a81a37bd305"}
Apr 16 14:09:41.579059 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:41.579024 2569 generic.go:358] "Generic (PLEG): container finished" podID="ed83c97b-6b86-4702-9cb4-5b094a1f4e40" containerID="4507aecfe738dc2de4fd0c16e4d745ff3fd4b429ebdcd80b5fe0b782c13ade48" exitCode=0
Apr 16 14:09:41.579507 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:41.579134 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc" event={"ID":"ed83c97b-6b86-4702-9cb4-5b094a1f4e40","Type":"ContainerDied","Data":"4507aecfe738dc2de4fd0c16e4d745ff3fd4b429ebdcd80b5fe0b782c13ade48"}
Apr 16 14:09:42.585797 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:42.585759 2569 generic.go:358] "Generic (PLEG): container finished" podID="ed83c97b-6b86-4702-9cb4-5b094a1f4e40" containerID="08a65dd789fad53ad3efdac7af8fc334f2094d720e03d12b3899e0399503b83b" exitCode=0
Apr 16 14:09:42.586225 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:42.585837 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc" event={"ID":"ed83c97b-6b86-4702-9cb4-5b094a1f4e40","Type":"ContainerDied","Data":"08a65dd789fad53ad3efdac7af8fc334f2094d720e03d12b3899e0399503b83b"}
Apr 16 14:09:43.715757 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:43.715732 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc"
Apr 16 14:09:43.775950 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:43.775919 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz7bb\" (UniqueName: \"kubernetes.io/projected/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-kube-api-access-tz7bb\") pod \"ed83c97b-6b86-4702-9cb4-5b094a1f4e40\" (UID: \"ed83c97b-6b86-4702-9cb4-5b094a1f4e40\") "
Apr 16 14:09:43.776110 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:43.775990 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-util\") pod \"ed83c97b-6b86-4702-9cb4-5b094a1f4e40\" (UID: \"ed83c97b-6b86-4702-9cb4-5b094a1f4e40\") "
Apr 16 14:09:43.776110 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:43.776040 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-bundle\") pod \"ed83c97b-6b86-4702-9cb4-5b094a1f4e40\" (UID: \"ed83c97b-6b86-4702-9cb4-5b094a1f4e40\") "
Apr 16 14:09:43.776850 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:43.776823 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-bundle" (OuterVolumeSpecName: "bundle") pod "ed83c97b-6b86-4702-9cb4-5b094a1f4e40" (UID: "ed83c97b-6b86-4702-9cb4-5b094a1f4e40"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:09:43.778048 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:43.778027 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-kube-api-access-tz7bb" (OuterVolumeSpecName: "kube-api-access-tz7bb") pod "ed83c97b-6b86-4702-9cb4-5b094a1f4e40" (UID: "ed83c97b-6b86-4702-9cb4-5b094a1f4e40"). InnerVolumeSpecName "kube-api-access-tz7bb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:09:43.781448 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:43.781421 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-util" (OuterVolumeSpecName: "util") pod "ed83c97b-6b86-4702-9cb4-5b094a1f4e40" (UID: "ed83c97b-6b86-4702-9cb4-5b094a1f4e40"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:09:43.877077 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:43.876997 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-util\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:09:43.877077 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:43.877022 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-bundle\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:09:43.877077 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:43.877031 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tz7bb\" (UniqueName: \"kubernetes.io/projected/ed83c97b-6b86-4702-9cb4-5b094a1f4e40-kube-api-access-tz7bb\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:09:44.594873 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:44.594839 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc" event={"ID":"ed83c97b-6b86-4702-9cb4-5b094a1f4e40","Type":"ContainerDied","Data":"6b31166971bf608ff94bc105c00b9d38865166ec276a8205c1436a81a37bd305"}
Apr 16 14:09:44.594873 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:44.594875 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b31166971bf608ff94bc105c00b9d38865166ec276a8205c1436a81a37bd305"
Apr 16 14:09:44.595078 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:44.594886 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fthpc"
Apr 16 14:09:53.877182 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:53.877151 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm"]
Apr 16 14:09:53.877551 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:53.877486 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed83c97b-6b86-4702-9cb4-5b094a1f4e40" containerName="extract"
Apr 16 14:09:53.877551 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:53.877498 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed83c97b-6b86-4702-9cb4-5b094a1f4e40" containerName="extract"
Apr 16 14:09:53.877551 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:53.877517 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed83c97b-6b86-4702-9cb4-5b094a1f4e40" containerName="util"
Apr 16 14:09:53.877551 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:53.877522 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed83c97b-6b86-4702-9cb4-5b094a1f4e40" containerName="util"
Apr 16 14:09:53.877551 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:53.877529 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed83c97b-6b86-4702-9cb4-5b094a1f4e40" containerName="pull"
Apr 16 14:09:53.877551 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:53.877535 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed83c97b-6b86-4702-9cb4-5b094a1f4e40" containerName="pull"
Apr 16 14:09:53.877754 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:53.877586 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed83c97b-6b86-4702-9cb4-5b094a1f4e40" containerName="extract"
Apr 16 14:09:53.880589 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:53.880573 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm"
Apr 16 14:09:53.883479 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:53.883456 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 14:09:53.883581 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:53.883456 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 14:09:53.884507 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:53.884490 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8zrx2\""
Apr 16 14:09:53.890139 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:53.890117 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm"]
Apr 16 14:09:53.958354 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:53.958314 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vstjt\" (UniqueName: \"kubernetes.io/projected/ad4b0f0a-8af6-461e-bb14-192fae2b7327-kube-api-access-vstjt\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm\" (UID: \"ad4b0f0a-8af6-461e-bb14-192fae2b7327\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm"
Apr 16 14:09:53.958519 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:53.958379 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ad4b0f0a-8af6-461e-bb14-192fae2b7327-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm\" (UID: \"ad4b0f0a-8af6-461e-bb14-192fae2b7327\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm"
Apr 16 14:09:53.958519 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:53.958420 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ad4b0f0a-8af6-461e-bb14-192fae2b7327-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm\" (UID: \"ad4b0f0a-8af6-461e-bb14-192fae2b7327\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm"
Apr 16 14:09:54.059307 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:54.059274 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ad4b0f0a-8af6-461e-bb14-192fae2b7327-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm\" (UID: \"ad4b0f0a-8af6-461e-bb14-192fae2b7327\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm"
Apr 16 14:09:54.059307 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:54.059311 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ad4b0f0a-8af6-461e-bb14-192fae2b7327-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm\" (UID: \"ad4b0f0a-8af6-461e-bb14-192fae2b7327\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm"
Apr 16 14:09:54.059557 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:54.059369 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vstjt\" (UniqueName: \"kubernetes.io/projected/ad4b0f0a-8af6-461e-bb14-192fae2b7327-kube-api-access-vstjt\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm\" (UID: \"ad4b0f0a-8af6-461e-bb14-192fae2b7327\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm"
Apr 16 14:09:54.059691 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:54.059669 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ad4b0f0a-8af6-461e-bb14-192fae2b7327-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm\" (UID: \"ad4b0f0a-8af6-461e-bb14-192fae2b7327\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm"
Apr 16 14:09:54.059766 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:54.059730 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ad4b0f0a-8af6-461e-bb14-192fae2b7327-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm\" (UID: \"ad4b0f0a-8af6-461e-bb14-192fae2b7327\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm"
Apr 16 14:09:54.068122 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:54.068101 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vstjt\" (UniqueName: \"kubernetes.io/projected/ad4b0f0a-8af6-461e-bb14-192fae2b7327-kube-api-access-vstjt\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm\" (UID: \"ad4b0f0a-8af6-461e-bb14-192fae2b7327\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm"
Apr 16 14:09:54.190997 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:54.190965 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm"
Apr 16 14:09:54.325083 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:54.324997 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm"]
Apr 16 14:09:54.327551 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:09:54.327525 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad4b0f0a_8af6_461e_bb14_192fae2b7327.slice/crio-cab8a249ecd88efc58c9eeb55c085963f0c545776717e392159cc2de90032943 WatchSource:0}: Error finding container cab8a249ecd88efc58c9eeb55c085963f0c545776717e392159cc2de90032943: Status 404 returned error can't find the container with id cab8a249ecd88efc58c9eeb55c085963f0c545776717e392159cc2de90032943
Apr 16 14:09:54.626841 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:54.626751 2569 generic.go:358] "Generic (PLEG): container finished" podID="ad4b0f0a-8af6-461e-bb14-192fae2b7327" containerID="fd404c2bd8d9d7df7c025c8b64d76552fb3d261b9737d4c995018e60025414e6" exitCode=0
Apr 16 14:09:54.627016 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:54.626841 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm" event={"ID":"ad4b0f0a-8af6-461e-bb14-192fae2b7327","Type":"ContainerDied","Data":"fd404c2bd8d9d7df7c025c8b64d76552fb3d261b9737d4c995018e60025414e6"}
Apr 16 14:09:54.627016 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:54.626875 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm" event={"ID":"ad4b0f0a-8af6-461e-bb14-192fae2b7327","Type":"ContainerStarted","Data":"cab8a249ecd88efc58c9eeb55c085963f0c545776717e392159cc2de90032943"}
Apr 16 14:09:55.630916 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.630888 2569 generic.go:358] "Generic (PLEG): container finished" podID="ad4b0f0a-8af6-461e-bb14-192fae2b7327" containerID="02062e95604546289d552fb552203a34ef6a6ab4c3599e34fdda48babb2cbf7f" exitCode=0
Apr 16 14:09:55.631367 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.630959 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm" event={"ID":"ad4b0f0a-8af6-461e-bb14-192fae2b7327","Type":"ContainerDied","Data":"02062e95604546289d552fb552203a34ef6a6ab4c3599e34fdda48babb2cbf7f"}
Apr 16 14:09:55.742824 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.742754 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"]
Apr 16 14:09:55.746056 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.746032 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"
Apr 16 14:09:55.749380 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.749345 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 14:09:55.749380 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.749375 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 14:09:55.749584 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.749382 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:09:55.749584 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.749345 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 14:09:55.749751 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.749728 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-5tl54\""
Apr 16 14:09:55.749944 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.749930 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 14:09:55.765395 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.765371 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"]
Apr 16 14:09:55.875271 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.875223 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/afa03934-ff8c-4542-a83b-ae7567abef53-metrics-cert\") pod \"lws-controller-manager-dc77c844c-d6p2z\" (UID: \"afa03934-ff8c-4542-a83b-ae7567abef53\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"
Apr 16 14:09:55.875420 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.875338 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktplb\" (UniqueName: \"kubernetes.io/projected/afa03934-ff8c-4542-a83b-ae7567abef53-kube-api-access-ktplb\") pod \"lws-controller-manager-dc77c844c-d6p2z\" (UID: \"afa03934-ff8c-4542-a83b-ae7567abef53\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"
Apr 16 14:09:55.875420 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.875396 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/afa03934-ff8c-4542-a83b-ae7567abef53-manager-config\") pod \"lws-controller-manager-dc77c844c-d6p2z\" (UID: \"afa03934-ff8c-4542-a83b-ae7567abef53\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"
Apr 16 14:09:55.875504 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.875419 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afa03934-ff8c-4542-a83b-ae7567abef53-cert\") pod \"lws-controller-manager-dc77c844c-d6p2z\" (UID: \"afa03934-ff8c-4542-a83b-ae7567abef53\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"
Apr 16 14:09:55.976467 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.976438 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/afa03934-ff8c-4542-a83b-ae7567abef53-manager-config\") pod \"lws-controller-manager-dc77c844c-d6p2z\" (UID: \"afa03934-ff8c-4542-a83b-ae7567abef53\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"
Apr 16 14:09:55.976467 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.976475 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afa03934-ff8c-4542-a83b-ae7567abef53-cert\") pod \"lws-controller-manager-dc77c844c-d6p2z\" (UID: \"afa03934-ff8c-4542-a83b-ae7567abef53\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"
Apr 16 14:09:55.976676 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.976512 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/afa03934-ff8c-4542-a83b-ae7567abef53-metrics-cert\") pod \"lws-controller-manager-dc77c844c-d6p2z\" (UID: \"afa03934-ff8c-4542-a83b-ae7567abef53\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"
Apr 16 14:09:55.976676 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.976554 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktplb\" (UniqueName: \"kubernetes.io/projected/afa03934-ff8c-4542-a83b-ae7567abef53-kube-api-access-ktplb\") pod \"lws-controller-manager-dc77c844c-d6p2z\" (UID: \"afa03934-ff8c-4542-a83b-ae7567abef53\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"
Apr 16 14:09:55.982995 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.977535 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/afa03934-ff8c-4542-a83b-ae7567abef53-manager-config\") pod \"lws-controller-manager-dc77c844c-d6p2z\" (UID: \"afa03934-ff8c-4542-a83b-ae7567abef53\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"
Apr 16 14:09:55.982995 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.979901 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/afa03934-ff8c-4542-a83b-ae7567abef53-metrics-cert\") pod \"lws-controller-manager-dc77c844c-d6p2z\" (UID: \"afa03934-ff8c-4542-a83b-ae7567abef53\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"
Apr 16 14:09:55.983862 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.983840 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afa03934-ff8c-4542-a83b-ae7567abef53-cert\") pod \"lws-controller-manager-dc77c844c-d6p2z\" (UID: \"afa03934-ff8c-4542-a83b-ae7567abef53\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"
Apr 16 14:09:55.991171 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:55.991145 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktplb\" (UniqueName: \"kubernetes.io/projected/afa03934-ff8c-4542-a83b-ae7567abef53-kube-api-access-ktplb\") pod \"lws-controller-manager-dc77c844c-d6p2z\" (UID: \"afa03934-ff8c-4542-a83b-ae7567abef53\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"
Apr 16 14:09:56.055883 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:56.055805 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"
Apr 16 14:09:56.188882 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:56.188838 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"]
Apr 16 14:09:56.193634 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:09:56.193607 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafa03934_ff8c_4542_a83b_ae7567abef53.slice/crio-a140f45b1584240e3adea46d441fc8a016a10c70c890302fc63ae7524df3150d WatchSource:0}: Error finding container a140f45b1584240e3adea46d441fc8a016a10c70c890302fc63ae7524df3150d: Status 404 returned error can't find the container with id a140f45b1584240e3adea46d441fc8a016a10c70c890302fc63ae7524df3150d
Apr 16 14:09:56.635153 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:56.635094 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z" event={"ID":"afa03934-ff8c-4542-a83b-ae7567abef53","Type":"ContainerStarted","Data":"a140f45b1584240e3adea46d441fc8a016a10c70c890302fc63ae7524df3150d"}
Apr 16 14:09:56.636839 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:56.636817 2569 generic.go:358] "Generic (PLEG): container finished" podID="ad4b0f0a-8af6-461e-bb14-192fae2b7327" containerID="eed84ff58a0d92580e8e0d1b4aa725581c180ca34708352fbc178ad18ad7a7fb" exitCode=0
Apr 16 14:09:56.636949 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:56.636856 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm" event={"ID":"ad4b0f0a-8af6-461e-bb14-192fae2b7327","Type":"ContainerDied","Data":"eed84ff58a0d92580e8e0d1b4aa725581c180ca34708352fbc178ad18ad7a7fb"}
Apr 16 14:09:57.787450 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:57.787421 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm"
Apr 16 14:09:57.892106 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:57.892070 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ad4b0f0a-8af6-461e-bb14-192fae2b7327-bundle\") pod \"ad4b0f0a-8af6-461e-bb14-192fae2b7327\" (UID: \"ad4b0f0a-8af6-461e-bb14-192fae2b7327\") "
Apr 16 14:09:57.892368 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:57.892115 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vstjt\" (UniqueName: \"kubernetes.io/projected/ad4b0f0a-8af6-461e-bb14-192fae2b7327-kube-api-access-vstjt\") pod \"ad4b0f0a-8af6-461e-bb14-192fae2b7327\" (UID: \"ad4b0f0a-8af6-461e-bb14-192fae2b7327\") "
Apr 16 14:09:57.892368 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:57.892202 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ad4b0f0a-8af6-461e-bb14-192fae2b7327-util\") pod \"ad4b0f0a-8af6-461e-bb14-192fae2b7327\" (UID: \"ad4b0f0a-8af6-461e-bb14-192fae2b7327\") "
Apr 16 14:09:57.893271 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:57.893182 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad4b0f0a-8af6-461e-bb14-192fae2b7327-bundle" (OuterVolumeSpecName: "bundle") pod "ad4b0f0a-8af6-461e-bb14-192fae2b7327" (UID: "ad4b0f0a-8af6-461e-bb14-192fae2b7327"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:09:57.894800 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:57.894772 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad4b0f0a-8af6-461e-bb14-192fae2b7327-kube-api-access-vstjt" (OuterVolumeSpecName: "kube-api-access-vstjt") pod "ad4b0f0a-8af6-461e-bb14-192fae2b7327" (UID: "ad4b0f0a-8af6-461e-bb14-192fae2b7327"). InnerVolumeSpecName "kube-api-access-vstjt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:09:57.898876 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:57.898833 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad4b0f0a-8af6-461e-bb14-192fae2b7327-util" (OuterVolumeSpecName: "util") pod "ad4b0f0a-8af6-461e-bb14-192fae2b7327" (UID: "ad4b0f0a-8af6-461e-bb14-192fae2b7327"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:09:57.992891 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:57.992858 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ad4b0f0a-8af6-461e-bb14-192fae2b7327-bundle\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:09:57.992891 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:57.992887 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vstjt\" (UniqueName: \"kubernetes.io/projected/ad4b0f0a-8af6-461e-bb14-192fae2b7327-kube-api-access-vstjt\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:09:57.992891 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:57.992901 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ad4b0f0a-8af6-461e-bb14-192fae2b7327-util\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:09:58.647271 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:58.647159 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm" event={"ID":"ad4b0f0a-8af6-461e-bb14-192fae2b7327","Type":"ContainerDied","Data":"cab8a249ecd88efc58c9eeb55c085963f0c545776717e392159cc2de90032943"}
Apr 16 14:09:58.647271 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:58.647207 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cab8a249ecd88efc58c9eeb55c085963f0c545776717e392159cc2de90032943"
Apr 16 14:09:58.647271 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:58.647180 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gtpcm"
Apr 16 14:09:58.648526 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:58.648495 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z" event={"ID":"afa03934-ff8c-4542-a83b-ae7567abef53","Type":"ContainerStarted","Data":"9013effc390de1ae5460e729190a69dafa39385b3a2bd4020dc78e86e9aa4cb2"}
Apr 16 14:09:58.648650 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:58.648615 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z"
Apr 16 14:09:58.689628 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:09:58.689586 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z" podStartSLOduration=1.5072568259999999 podStartE2EDuration="3.689572567s" podCreationTimestamp="2026-04-16 14:09:55 +0000 UTC" firstStartedPulling="2026-04-16 14:09:56.195453856 +0000 UTC m=+631.203690094" lastFinishedPulling="2026-04-16 14:09:58.377769599 +0000 UTC m=+633.386005835" observedRunningTime="2026-04-16 14:09:58.687416306 +0000 UTC m=+633.695652554" watchObservedRunningTime="2026-04-16 14:09:58.689572567 +0000 UTC m=+633.697808823"
Apr 16 14:10:09.654234 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:09.654202 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-dc77c844c-d6p2z" Apr 16 14:10:39.740772 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.740733 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r"] Apr 16 14:10:39.741187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.741059 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad4b0f0a-8af6-461e-bb14-192fae2b7327" containerName="extract" Apr 16 14:10:39.741187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.741071 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4b0f0a-8af6-461e-bb14-192fae2b7327" containerName="extract" Apr 16 14:10:39.741187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.741094 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad4b0f0a-8af6-461e-bb14-192fae2b7327" containerName="util" Apr 16 14:10:39.741187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.741099 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4b0f0a-8af6-461e-bb14-192fae2b7327" containerName="util" Apr 16 14:10:39.741187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.741105 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad4b0f0a-8af6-461e-bb14-192fae2b7327" containerName="pull" Apr 16 14:10:39.741187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.741110 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4b0f0a-8af6-461e-bb14-192fae2b7327" containerName="pull" Apr 16 14:10:39.741187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.741164 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad4b0f0a-8af6-461e-bb14-192fae2b7327" containerName="extract" Apr 16 14:10:39.743801 ip-10-0-128-60 
kubenswrapper[2569]: I0416 14:10:39.743782 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r" Apr 16 14:10:39.746413 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.746390 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 14:10:39.747490 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.747467 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8zrx2\"" Apr 16 14:10:39.747490 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.747481 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 14:10:39.752478 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.752452 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r"] Apr 16 14:10:39.805060 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.805028 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd"] Apr 16 14:10:39.807470 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.807454 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd" Apr 16 14:10:39.817215 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.817192 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd"] Apr 16 14:10:39.847628 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.847583 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edea2ef2-f9c5-4e97-a326-4ef339743784-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r\" (UID: \"edea2ef2-f9c5-4e97-a326-4ef339743784\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r" Apr 16 14:10:39.847628 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.847630 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edea2ef2-f9c5-4e97-a326-4ef339743784-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r\" (UID: \"edea2ef2-f9c5-4e97-a326-4ef339743784\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r" Apr 16 14:10:39.847846 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.847703 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br9fw\" (UniqueName: \"kubernetes.io/projected/edea2ef2-f9c5-4e97-a326-4ef339743784-kube-api-access-br9fw\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r\" (UID: \"edea2ef2-f9c5-4e97-a326-4ef339743784\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r" Apr 16 14:10:39.907679 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.907646 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc"] Apr 16 14:10:39.910228 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.910210 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc" Apr 16 14:10:39.919491 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.919465 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc"] Apr 16 14:10:39.948963 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.948931 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edea2ef2-f9c5-4e97-a326-4ef339743784-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r\" (UID: \"edea2ef2-f9c5-4e97-a326-4ef339743784\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r" Apr 16 14:10:39.948963 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.948964 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edea2ef2-f9c5-4e97-a326-4ef339743784-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r\" (UID: \"edea2ef2-f9c5-4e97-a326-4ef339743784\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r" Apr 16 14:10:39.949179 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.948995 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75da46c0-8d62-40ef-a683-87972dde559f-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd\" (UID: \"75da46c0-8d62-40ef-a683-87972dde559f\") " 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd" Apr 16 14:10:39.949179 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.949037 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75da46c0-8d62-40ef-a683-87972dde559f-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd\" (UID: \"75da46c0-8d62-40ef-a683-87972dde559f\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd" Apr 16 14:10:39.949179 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.949074 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-br9fw\" (UniqueName: \"kubernetes.io/projected/edea2ef2-f9c5-4e97-a326-4ef339743784-kube-api-access-br9fw\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r\" (UID: \"edea2ef2-f9c5-4e97-a326-4ef339743784\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r" Apr 16 14:10:39.949179 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.949102 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c8hx\" (UniqueName: \"kubernetes.io/projected/75da46c0-8d62-40ef-a683-87972dde559f-kube-api-access-4c8hx\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd\" (UID: \"75da46c0-8d62-40ef-a683-87972dde559f\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd" Apr 16 14:10:39.949411 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.949388 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edea2ef2-f9c5-4e97-a326-4ef339743784-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r\" (UID: \"edea2ef2-f9c5-4e97-a326-4ef339743784\") " 
pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r" Apr 16 14:10:39.949450 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.949387 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edea2ef2-f9c5-4e97-a326-4ef339743784-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r\" (UID: \"edea2ef2-f9c5-4e97-a326-4ef339743784\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r" Apr 16 14:10:39.957631 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:39.957603 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-br9fw\" (UniqueName: \"kubernetes.io/projected/edea2ef2-f9c5-4e97-a326-4ef339743784-kube-api-access-br9fw\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r\" (UID: \"edea2ef2-f9c5-4e97-a326-4ef339743784\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r" Apr 16 14:10:40.009773 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.009686 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4"] Apr 16 14:10:40.012470 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.012453 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4" Apr 16 14:10:40.021106 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.021085 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4"] Apr 16 14:10:40.050471 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.050433 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3dfe1e36-48e4-4f7c-897e-928638961d80-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc\" (UID: \"3dfe1e36-48e4-4f7c-897e-928638961d80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc" Apr 16 14:10:40.050635 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.050501 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d9kq\" (UniqueName: \"kubernetes.io/projected/3dfe1e36-48e4-4f7c-897e-928638961d80-kube-api-access-8d9kq\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc\" (UID: \"3dfe1e36-48e4-4f7c-897e-928638961d80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc" Apr 16 14:10:40.050635 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.050530 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75da46c0-8d62-40ef-a683-87972dde559f-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd\" (UID: \"75da46c0-8d62-40ef-a683-87972dde559f\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd" Apr 16 14:10:40.050635 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.050567 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75da46c0-8d62-40ef-a683-87972dde559f-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd\" (UID: \"75da46c0-8d62-40ef-a683-87972dde559f\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd" Apr 16 14:10:40.050741 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.050678 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3dfe1e36-48e4-4f7c-897e-928638961d80-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc\" (UID: \"3dfe1e36-48e4-4f7c-897e-928638961d80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc" Apr 16 14:10:40.050741 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.050724 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4c8hx\" (UniqueName: \"kubernetes.io/projected/75da46c0-8d62-40ef-a683-87972dde559f-kube-api-access-4c8hx\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd\" (UID: \"75da46c0-8d62-40ef-a683-87972dde559f\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd" Apr 16 14:10:40.050858 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.050845 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75da46c0-8d62-40ef-a683-87972dde559f-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd\" (UID: \"75da46c0-8d62-40ef-a683-87972dde559f\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd" Apr 16 14:10:40.050938 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.050921 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/75da46c0-8d62-40ef-a683-87972dde559f-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd\" (UID: \"75da46c0-8d62-40ef-a683-87972dde559f\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd" Apr 16 14:10:40.053351 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.053331 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r" Apr 16 14:10:40.059787 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.059760 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c8hx\" (UniqueName: \"kubernetes.io/projected/75da46c0-8d62-40ef-a683-87972dde559f-kube-api-access-4c8hx\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd\" (UID: \"75da46c0-8d62-40ef-a683-87972dde559f\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd" Apr 16 14:10:40.116465 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.116367 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd" Apr 16 14:10:40.151901 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.151864 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3dfe1e36-48e4-4f7c-897e-928638961d80-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc\" (UID: \"3dfe1e36-48e4-4f7c-897e-928638961d80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc" Apr 16 14:10:40.152054 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.151927 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3dfe1e36-48e4-4f7c-897e-928638961d80-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc\" (UID: \"3dfe1e36-48e4-4f7c-897e-928638961d80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc" Apr 16 14:10:40.152054 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.151975 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zg6h\" (UniqueName: \"kubernetes.io/projected/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-kube-api-access-5zg6h\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4\" (UID: \"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4" Apr 16 14:10:40.152054 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.152006 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4\" (UID: \"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f\") " 
pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4" Apr 16 14:10:40.152054 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.152026 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4\" (UID: \"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4" Apr 16 14:10:40.152346 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.152073 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8d9kq\" (UniqueName: \"kubernetes.io/projected/3dfe1e36-48e4-4f7c-897e-928638961d80-kube-api-access-8d9kq\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc\" (UID: \"3dfe1e36-48e4-4f7c-897e-928638961d80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc" Apr 16 14:10:40.152474 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.152452 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3dfe1e36-48e4-4f7c-897e-928638961d80-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc\" (UID: \"3dfe1e36-48e4-4f7c-897e-928638961d80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc" Apr 16 14:10:40.152624 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.152602 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3dfe1e36-48e4-4f7c-897e-928638961d80-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc\" (UID: \"3dfe1e36-48e4-4f7c-897e-928638961d80\") " 
pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc" Apr 16 14:10:40.162045 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.161958 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d9kq\" (UniqueName: \"kubernetes.io/projected/3dfe1e36-48e4-4f7c-897e-928638961d80-kube-api-access-8d9kq\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc\" (UID: \"3dfe1e36-48e4-4f7c-897e-928638961d80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc" Apr 16 14:10:40.180444 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.180415 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r"] Apr 16 14:10:40.184019 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:10:40.183989 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedea2ef2_f9c5_4e97_a326_4ef339743784.slice/crio-c9c3b06a7c82fd9acfddcaac39aa3c3b8fb244d0dba9f4639aeb0f1b380f972c WatchSource:0}: Error finding container c9c3b06a7c82fd9acfddcaac39aa3c3b8fb244d0dba9f4639aeb0f1b380f972c: Status 404 returned error can't find the container with id c9c3b06a7c82fd9acfddcaac39aa3c3b8fb244d0dba9f4639aeb0f1b380f972c Apr 16 14:10:40.220126 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.220093 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc" Apr 16 14:10:40.249988 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.249963 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd"] Apr 16 14:10:40.251877 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:10:40.251846 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75da46c0_8d62_40ef_a683_87972dde559f.slice/crio-42e13729817e02b6ba5f48b0a7d9d797985601ee60db0e5192174816c061c930 WatchSource:0}: Error finding container 42e13729817e02b6ba5f48b0a7d9d797985601ee60db0e5192174816c061c930: Status 404 returned error can't find the container with id 42e13729817e02b6ba5f48b0a7d9d797985601ee60db0e5192174816c061c930 Apr 16 14:10:40.253289 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.253247 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zg6h\" (UniqueName: \"kubernetes.io/projected/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-kube-api-access-5zg6h\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4\" (UID: \"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4" Apr 16 14:10:40.253483 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.253458 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4\" (UID: \"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4" Apr 16 14:10:40.253609 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.253501 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4\" (UID: \"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4" Apr 16 14:10:40.253834 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.253808 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4\" (UID: \"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4" Apr 16 14:10:40.253897 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.253837 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4\" (UID: \"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4" Apr 16 14:10:40.262546 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.262484 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zg6h\" (UniqueName: \"kubernetes.io/projected/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-kube-api-access-5zg6h\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4\" (UID: \"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4" Apr 16 14:10:40.323103 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.323074 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4"
Apr 16 14:10:40.357304 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.357272 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc"]
Apr 16 14:10:40.358350 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:10:40.358315 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dfe1e36_48e4_4f7c_897e_928638961d80.slice/crio-8e26f4c9597423c87deca7dd9c6b496599f3fea17e8bf8754e355cc73e6fa06a WatchSource:0}: Error finding container 8e26f4c9597423c87deca7dd9c6b496599f3fea17e8bf8754e355cc73e6fa06a: Status 404 returned error can't find the container with id 8e26f4c9597423c87deca7dd9c6b496599f3fea17e8bf8754e355cc73e6fa06a
Apr 16 14:10:40.452398 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.452369 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4"]
Apr 16 14:10:40.457049 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:10:40.457020 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1916a7e_f79c_4d33_bd13_0d2bdd40fd3f.slice/crio-92804797bf4ab0ce6790f6aca2dcdab9418b7b93db74e825fc97d2c9505fec68 WatchSource:0}: Error finding container 92804797bf4ab0ce6790f6aca2dcdab9418b7b93db74e825fc97d2c9505fec68: Status 404 returned error can't find the container with id 92804797bf4ab0ce6790f6aca2dcdab9418b7b93db74e825fc97d2c9505fec68
Apr 16 14:10:40.792431 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.792398 2569 generic.go:358] "Generic (PLEG): container finished" podID="75da46c0-8d62-40ef-a683-87972dde559f" containerID="863fa0f9c85e5c2acbde461e7cbb1998ad3be3c208422a244ee0c81ccb27b9fa" exitCode=0
Apr 16 14:10:40.792917 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.792475 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd" event={"ID":"75da46c0-8d62-40ef-a683-87972dde559f","Type":"ContainerDied","Data":"863fa0f9c85e5c2acbde461e7cbb1998ad3be3c208422a244ee0c81ccb27b9fa"}
Apr 16 14:10:40.792917 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.792502 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd" event={"ID":"75da46c0-8d62-40ef-a683-87972dde559f","Type":"ContainerStarted","Data":"42e13729817e02b6ba5f48b0a7d9d797985601ee60db0e5192174816c061c930"}
Apr 16 14:10:40.793821 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.793795 2569 generic.go:358] "Generic (PLEG): container finished" podID="b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f" containerID="63385cfefb15f045603c96d7e06badfe3933b4caed43a9663ca27e49c8900933" exitCode=0
Apr 16 14:10:40.793951 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.793878 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4" event={"ID":"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f","Type":"ContainerDied","Data":"63385cfefb15f045603c96d7e06badfe3933b4caed43a9663ca27e49c8900933"}
Apr 16 14:10:40.793951 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.793915 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4" event={"ID":"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f","Type":"ContainerStarted","Data":"92804797bf4ab0ce6790f6aca2dcdab9418b7b93db74e825fc97d2c9505fec68"}
Apr 16 14:10:40.795303 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.795282 2569 generic.go:358] "Generic (PLEG): container finished" podID="edea2ef2-f9c5-4e97-a326-4ef339743784" containerID="146e561b44412182d7eadba17093692c4f61d8ce26e7843b9d63415516b409fb" exitCode=0
Apr 16 14:10:40.795404 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.795352 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r" event={"ID":"edea2ef2-f9c5-4e97-a326-4ef339743784","Type":"ContainerDied","Data":"146e561b44412182d7eadba17093692c4f61d8ce26e7843b9d63415516b409fb"}
Apr 16 14:10:40.795404 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.795384 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r" event={"ID":"edea2ef2-f9c5-4e97-a326-4ef339743784","Type":"ContainerStarted","Data":"c9c3b06a7c82fd9acfddcaac39aa3c3b8fb244d0dba9f4639aeb0f1b380f972c"}
Apr 16 14:10:40.796728 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.796711 2569 generic.go:358] "Generic (PLEG): container finished" podID="3dfe1e36-48e4-4f7c-897e-928638961d80" containerID="b3dfa0a15720e559bea452253808b5930ed56b1ca40947635b491b34a63c7d42" exitCode=0
Apr 16 14:10:40.796807 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.796738 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc" event={"ID":"3dfe1e36-48e4-4f7c-897e-928638961d80","Type":"ContainerDied","Data":"b3dfa0a15720e559bea452253808b5930ed56b1ca40947635b491b34a63c7d42"}
Apr 16 14:10:40.796807 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:40.796754 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc" event={"ID":"3dfe1e36-48e4-4f7c-897e-928638961d80","Type":"ContainerStarted","Data":"8e26f4c9597423c87deca7dd9c6b496599f3fea17e8bf8754e355cc73e6fa06a"}
Apr 16 14:10:41.803185 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:41.803158 2569 generic.go:358] "Generic (PLEG): container finished" podID="3dfe1e36-48e4-4f7c-897e-928638961d80" containerID="77394f4e7303ba1a7ba73a33c92001f31e0083ce9d863a048bdf6b0f7d8966e2" exitCode=0
Apr 16 14:10:41.803581 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:41.803295 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc" event={"ID":"3dfe1e36-48e4-4f7c-897e-928638961d80","Type":"ContainerDied","Data":"77394f4e7303ba1a7ba73a33c92001f31e0083ce9d863a048bdf6b0f7d8966e2"}
Apr 16 14:10:42.808452 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:42.808415 2569 generic.go:358] "Generic (PLEG): container finished" podID="3dfe1e36-48e4-4f7c-897e-928638961d80" containerID="f67b3f37e2433cdcc4782085dbe038186040b74ded61c230e51212d4386f2af5" exitCode=0
Apr 16 14:10:42.808850 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:42.808499 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc" event={"ID":"3dfe1e36-48e4-4f7c-897e-928638961d80","Type":"ContainerDied","Data":"f67b3f37e2433cdcc4782085dbe038186040b74ded61c230e51212d4386f2af5"}
Apr 16 14:10:42.810143 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:42.810124 2569 generic.go:358] "Generic (PLEG): container finished" podID="75da46c0-8d62-40ef-a683-87972dde559f" containerID="9939902e7f04cbb1c918389d2af0a2614107a51b9c4d1983b506177292446db1" exitCode=0
Apr 16 14:10:42.810202 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:42.810185 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd" event={"ID":"75da46c0-8d62-40ef-a683-87972dde559f","Type":"ContainerDied","Data":"9939902e7f04cbb1c918389d2af0a2614107a51b9c4d1983b506177292446db1"}
Apr 16 14:10:42.811893 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:42.811845 2569 generic.go:358] "Generic (PLEG): container finished" podID="b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f" containerID="ef60277084d831200c910da15a200c6d320f183c0ff6990b5454ab43d31ef7d3" exitCode=0
Apr 16 14:10:42.811946 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:42.811924 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4" event={"ID":"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f","Type":"ContainerDied","Data":"ef60277084d831200c910da15a200c6d320f183c0ff6990b5454ab43d31ef7d3"}
Apr 16 14:10:42.813647 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:42.813631 2569 generic.go:358] "Generic (PLEG): container finished" podID="edea2ef2-f9c5-4e97-a326-4ef339743784" containerID="09886b29c45e4375d44d030a410ba1c8cc1eb46680658f35ea2594623c357df2" exitCode=0
Apr 16 14:10:42.813714 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:42.813680 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r" event={"ID":"edea2ef2-f9c5-4e97-a326-4ef339743784","Type":"ContainerDied","Data":"09886b29c45e4375d44d030a410ba1c8cc1eb46680658f35ea2594623c357df2"}
Apr 16 14:10:43.818985 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:43.818942 2569 generic.go:358] "Generic (PLEG): container finished" podID="75da46c0-8d62-40ef-a683-87972dde559f" containerID="03a415d58e9e9a267c5ff31f90acc541779135879df2645ccd91818996d0378c" exitCode=0
Apr 16 14:10:43.819463 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:43.818985 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd" event={"ID":"75da46c0-8d62-40ef-a683-87972dde559f","Type":"ContainerDied","Data":"03a415d58e9e9a267c5ff31f90acc541779135879df2645ccd91818996d0378c"}
Apr 16 14:10:43.820764 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:43.820741 2569 generic.go:358] "Generic (PLEG): container finished" podID="b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f" containerID="43f7b18e7ae479d9e2d186226f21ac42e3aa7f730101a4f068147ebf00fd8ccb" exitCode=0
Apr 16 14:10:43.820874 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:43.820790 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4" event={"ID":"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f","Type":"ContainerDied","Data":"43f7b18e7ae479d9e2d186226f21ac42e3aa7f730101a4f068147ebf00fd8ccb"}
Apr 16 14:10:43.822511 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:43.822485 2569 generic.go:358] "Generic (PLEG): container finished" podID="edea2ef2-f9c5-4e97-a326-4ef339743784" containerID="4e56c79d14d980a18c8c7715bb74ddf1eab37b3069e5842ede3a34de25e10a8d" exitCode=0
Apr 16 14:10:43.822604 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:43.822583 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r" event={"ID":"edea2ef2-f9c5-4e97-a326-4ef339743784","Type":"ContainerDied","Data":"4e56c79d14d980a18c8c7715bb74ddf1eab37b3069e5842ede3a34de25e10a8d"}
Apr 16 14:10:43.946152 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:43.946124 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc"
Apr 16 14:10:44.088945 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:44.088846 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d9kq\" (UniqueName: \"kubernetes.io/projected/3dfe1e36-48e4-4f7c-897e-928638961d80-kube-api-access-8d9kq\") pod \"3dfe1e36-48e4-4f7c-897e-928638961d80\" (UID: \"3dfe1e36-48e4-4f7c-897e-928638961d80\") "
Apr 16 14:10:44.088945 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:44.088907 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3dfe1e36-48e4-4f7c-897e-928638961d80-bundle\") pod \"3dfe1e36-48e4-4f7c-897e-928638961d80\" (UID: \"3dfe1e36-48e4-4f7c-897e-928638961d80\") "
Apr 16 14:10:44.088945 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:44.088940 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3dfe1e36-48e4-4f7c-897e-928638961d80-util\") pod \"3dfe1e36-48e4-4f7c-897e-928638961d80\" (UID: \"3dfe1e36-48e4-4f7c-897e-928638961d80\") "
Apr 16 14:10:44.089414 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:44.089382 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dfe1e36-48e4-4f7c-897e-928638961d80-bundle" (OuterVolumeSpecName: "bundle") pod "3dfe1e36-48e4-4f7c-897e-928638961d80" (UID: "3dfe1e36-48e4-4f7c-897e-928638961d80"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:10:44.090933 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:44.090909 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dfe1e36-48e4-4f7c-897e-928638961d80-kube-api-access-8d9kq" (OuterVolumeSpecName: "kube-api-access-8d9kq") pod "3dfe1e36-48e4-4f7c-897e-928638961d80" (UID: "3dfe1e36-48e4-4f7c-897e-928638961d80"). InnerVolumeSpecName "kube-api-access-8d9kq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:10:44.095440 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:44.095414 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dfe1e36-48e4-4f7c-897e-928638961d80-util" (OuterVolumeSpecName: "util") pod "3dfe1e36-48e4-4f7c-897e-928638961d80" (UID: "3dfe1e36-48e4-4f7c-897e-928638961d80"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:10:44.190421 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:44.190386 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3dfe1e36-48e4-4f7c-897e-928638961d80-bundle\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:10:44.190421 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:44.190414 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3dfe1e36-48e4-4f7c-897e-928638961d80-util\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:10:44.190421 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:44.190423 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8d9kq\" (UniqueName: \"kubernetes.io/projected/3dfe1e36-48e4-4f7c-897e-928638961d80-kube-api-access-8d9kq\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:10:44.828346 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:44.828309 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc" event={"ID":"3dfe1e36-48e4-4f7c-897e-928638961d80","Type":"ContainerDied","Data":"8e26f4c9597423c87deca7dd9c6b496599f3fea17e8bf8754e355cc73e6fa06a"}
Apr 16 14:10:44.828346 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:44.828348 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e26f4c9597423c87deca7dd9c6b496599f3fea17e8bf8754e355cc73e6fa06a"
Apr 16 14:10:44.828788 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:44.828323 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tkszc"
Apr 16 14:10:44.981532 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:44.981504 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r"
Apr 16 14:10:45.009380 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.009352 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd"
Apr 16 14:10:45.012274 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.012234 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4"
Apr 16 14:10:45.098040 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.098014 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-bundle\") pod \"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f\" (UID: \"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f\") "
Apr 16 14:10:45.098179 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.098049 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br9fw\" (UniqueName: \"kubernetes.io/projected/edea2ef2-f9c5-4e97-a326-4ef339743784-kube-api-access-br9fw\") pod \"edea2ef2-f9c5-4e97-a326-4ef339743784\" (UID: \"edea2ef2-f9c5-4e97-a326-4ef339743784\") "
Apr 16 14:10:45.098179 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.098070 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c8hx\" (UniqueName: \"kubernetes.io/projected/75da46c0-8d62-40ef-a683-87972dde559f-kube-api-access-4c8hx\") pod \"75da46c0-8d62-40ef-a683-87972dde559f\" (UID: \"75da46c0-8d62-40ef-a683-87972dde559f\") "
Apr 16 14:10:45.098179 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.098089 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zg6h\" (UniqueName: \"kubernetes.io/projected/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-kube-api-access-5zg6h\") pod \"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f\" (UID: \"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f\") "
Apr 16 14:10:45.098179 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.098124 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75da46c0-8d62-40ef-a683-87972dde559f-bundle\") pod \"75da46c0-8d62-40ef-a683-87972dde559f\" (UID: \"75da46c0-8d62-40ef-a683-87972dde559f\") "
Apr 16 14:10:45.098179 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.098155 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edea2ef2-f9c5-4e97-a326-4ef339743784-bundle\") pod \"edea2ef2-f9c5-4e97-a326-4ef339743784\" (UID: \"edea2ef2-f9c5-4e97-a326-4ef339743784\") "
Apr 16 14:10:45.098179 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.098178 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75da46c0-8d62-40ef-a683-87972dde559f-util\") pod \"75da46c0-8d62-40ef-a683-87972dde559f\" (UID: \"75da46c0-8d62-40ef-a683-87972dde559f\") "
Apr 16 14:10:45.098524 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.098224 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-util\") pod \"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f\" (UID: \"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f\") "
Apr 16 14:10:45.098524 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.098240 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edea2ef2-f9c5-4e97-a326-4ef339743784-util\") pod \"edea2ef2-f9c5-4e97-a326-4ef339743784\" (UID: \"edea2ef2-f9c5-4e97-a326-4ef339743784\") "
Apr 16 14:10:45.098847 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.098728 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edea2ef2-f9c5-4e97-a326-4ef339743784-bundle" (OuterVolumeSpecName: "bundle") pod "edea2ef2-f9c5-4e97-a326-4ef339743784" (UID: "edea2ef2-f9c5-4e97-a326-4ef339743784"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:10:45.098847 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.098788 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-bundle" (OuterVolumeSpecName: "bundle") pod "b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f" (UID: "b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:10:45.099427 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.099378 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75da46c0-8d62-40ef-a683-87972dde559f-bundle" (OuterVolumeSpecName: "bundle") pod "75da46c0-8d62-40ef-a683-87972dde559f" (UID: "75da46c0-8d62-40ef-a683-87972dde559f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:10:45.100504 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.100480 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edea2ef2-f9c5-4e97-a326-4ef339743784-kube-api-access-br9fw" (OuterVolumeSpecName: "kube-api-access-br9fw") pod "edea2ef2-f9c5-4e97-a326-4ef339743784" (UID: "edea2ef2-f9c5-4e97-a326-4ef339743784"). InnerVolumeSpecName "kube-api-access-br9fw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:10:45.100842 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.100814 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-kube-api-access-5zg6h" (OuterVolumeSpecName: "kube-api-access-5zg6h") pod "b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f" (UID: "b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f"). InnerVolumeSpecName "kube-api-access-5zg6h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:10:45.101016 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.100994 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75da46c0-8d62-40ef-a683-87972dde559f-kube-api-access-4c8hx" (OuterVolumeSpecName: "kube-api-access-4c8hx") pod "75da46c0-8d62-40ef-a683-87972dde559f" (UID: "75da46c0-8d62-40ef-a683-87972dde559f"). InnerVolumeSpecName "kube-api-access-4c8hx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:10:45.104811 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.104774 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75da46c0-8d62-40ef-a683-87972dde559f-util" (OuterVolumeSpecName: "util") pod "75da46c0-8d62-40ef-a683-87972dde559f" (UID: "75da46c0-8d62-40ef-a683-87972dde559f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:10:45.106796 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.106756 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edea2ef2-f9c5-4e97-a326-4ef339743784-util" (OuterVolumeSpecName: "util") pod "edea2ef2-f9c5-4e97-a326-4ef339743784" (UID: "edea2ef2-f9c5-4e97-a326-4ef339743784"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:10:45.106996 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.106965 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-util" (OuterVolumeSpecName: "util") pod "b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f" (UID: "b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:10:45.199860 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.199815 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edea2ef2-f9c5-4e97-a326-4ef339743784-bundle\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:10:45.199860 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.199850 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75da46c0-8d62-40ef-a683-87972dde559f-util\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:10:45.199860 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.199862 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-util\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:10:45.199860 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.199872 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edea2ef2-f9c5-4e97-a326-4ef339743784-util\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:10:45.200136 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.199883 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-bundle\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:10:45.200136 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.199895 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-br9fw\" (UniqueName: \"kubernetes.io/projected/edea2ef2-f9c5-4e97-a326-4ef339743784-kube-api-access-br9fw\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:10:45.200136 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.199910 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4c8hx\" (UniqueName: \"kubernetes.io/projected/75da46c0-8d62-40ef-a683-87972dde559f-kube-api-access-4c8hx\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:10:45.200136 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.199924 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5zg6h\" (UniqueName: \"kubernetes.io/projected/b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f-kube-api-access-5zg6h\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:10:45.200136 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.199936 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75da46c0-8d62-40ef-a683-87972dde559f-bundle\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:10:45.833274 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.833228 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd" event={"ID":"75da46c0-8d62-40ef-a683-87972dde559f","Type":"ContainerDied","Data":"42e13729817e02b6ba5f48b0a7d9d797985601ee60db0e5192174816c061c930"}
Apr 16 14:10:45.833274 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.833271 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503f7pxd"
Apr 16 14:10:45.833776 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.833276 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42e13729817e02b6ba5f48b0a7d9d797985601ee60db0e5192174816c061c930"
Apr 16 14:10:45.835008 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.834983 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4"
Apr 16 14:10:45.835008 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.834993 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30hpvq4" event={"ID":"b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f","Type":"ContainerDied","Data":"92804797bf4ab0ce6790f6aca2dcdab9418b7b93db74e825fc97d2c9505fec68"}
Apr 16 14:10:45.835197 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.835022 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92804797bf4ab0ce6790f6aca2dcdab9418b7b93db74e825fc97d2c9505fec68"
Apr 16 14:10:45.836926 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.836896 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r" event={"ID":"edea2ef2-f9c5-4e97-a326-4ef339743784","Type":"ContainerDied","Data":"c9c3b06a7c82fd9acfddcaac39aa3c3b8fb244d0dba9f4639aeb0f1b380f972c"}
Apr 16 14:10:45.836926 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.836923 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b29c9r"
Apr 16 14:10:45.837123 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:45.836929 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9c3b06a7c82fd9acfddcaac39aa3c3b8fb244d0dba9f4639aeb0f1b380f972c"
Apr 16 14:10:58.634905 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.634866 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-jvvfr"]
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635199 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edea2ef2-f9c5-4e97-a326-4ef339743784" containerName="pull"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635210 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="edea2ef2-f9c5-4e97-a326-4ef339743784" containerName="pull"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635221 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f" containerName="pull"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635226 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f" containerName="pull"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635236 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edea2ef2-f9c5-4e97-a326-4ef339743784" containerName="extract"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635244 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="edea2ef2-f9c5-4e97-a326-4ef339743784" containerName="extract"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635267 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dfe1e36-48e4-4f7c-897e-928638961d80" containerName="util"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635272 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dfe1e36-48e4-4f7c-897e-928638961d80" containerName="util"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635278 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f" containerName="util"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635282 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f" containerName="util"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635290 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75da46c0-8d62-40ef-a683-87972dde559f" containerName="util"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635294 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="75da46c0-8d62-40ef-a683-87972dde559f" containerName="util"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635305 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edea2ef2-f9c5-4e97-a326-4ef339743784" containerName="util"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635310 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="edea2ef2-f9c5-4e97-a326-4ef339743784" containerName="util"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635315 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dfe1e36-48e4-4f7c-897e-928638961d80" containerName="extract"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635319 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dfe1e36-48e4-4f7c-897e-928638961d80" containerName="extract"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635325 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f" containerName="extract"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635330 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f" containerName="extract"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635338 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dfe1e36-48e4-4f7c-897e-928638961d80" containerName="pull"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635342 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dfe1e36-48e4-4f7c-897e-928638961d80" containerName="pull"
Apr 16 14:10:58.635339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635349 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75da46c0-8d62-40ef-a683-87972dde559f" containerName="extract"
Apr 16 14:10:58.635966 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635354 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="75da46c0-8d62-40ef-a683-87972dde559f" containerName="extract"
Apr 16 14:10:58.635966 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635362 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75da46c0-8d62-40ef-a683-87972dde559f" containerName="pull"
Apr 16 14:10:58.635966 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635367 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="75da46c0-8d62-40ef-a683-87972dde559f" containerName="pull"
Apr 16 14:10:58.635966 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635418 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1916a7e-f79c-4d33-bd13-0d2bdd40fd3f" containerName="extract"
Apr 16 14:10:58.635966 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635429 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3dfe1e36-48e4-4f7c-897e-928638961d80" containerName="extract"
Apr 16 14:10:58.635966 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635434 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="edea2ef2-f9c5-4e97-a326-4ef339743784" containerName="extract"
Apr 16 14:10:58.635966 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.635442 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="75da46c0-8d62-40ef-a683-87972dde559f" containerName="extract"
Apr 16 14:10:58.637527 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.637511 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jvvfr"
Apr 16 14:10:58.641153 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.641129 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-rjxsn\""
Apr 16 14:10:58.641485 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.641465 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 14:10:58.642219 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.642203 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 16 14:10:58.642283 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.642210 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 14:10:58.651212 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.651188 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-jvvfr"]
Apr 16 14:10:58.821545 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.821504 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57cwf\" (UniqueName: \"kubernetes.io/projected/4c9bd71c-1397-46a5-a6df-59c7835d9635-kube-api-access-57cwf\") pod \"dns-operator-controller-manager-844548ff4c-jvvfr\" (UID: \"4c9bd71c-1397-46a5-a6df-59c7835d9635\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jvvfr"
Apr 16 14:10:58.922337 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.922304 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57cwf\" (UniqueName: \"kubernetes.io/projected/4c9bd71c-1397-46a5-a6df-59c7835d9635-kube-api-access-57cwf\") pod \"dns-operator-controller-manager-844548ff4c-jvvfr\" (UID: \"4c9bd71c-1397-46a5-a6df-59c7835d9635\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jvvfr"
Apr 16 14:10:58.931966 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.931944 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57cwf\" (UniqueName: \"kubernetes.io/projected/4c9bd71c-1397-46a5-a6df-59c7835d9635-kube-api-access-57cwf\") pod \"dns-operator-controller-manager-844548ff4c-jvvfr\" (UID: \"4c9bd71c-1397-46a5-a6df-59c7835d9635\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jvvfr"
Apr 16 14:10:58.947732 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:58.947703 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jvvfr"
Apr 16 14:10:59.080585 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:59.080558 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-jvvfr"]
Apr 16 14:10:59.082331 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:10:59.082298 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c9bd71c_1397_46a5_a6df_59c7835d9635.slice/crio-6ea509c8e88f6178f84c59cd457f5966fc56b55dd66d04e6b6b71b9e43389266 WatchSource:0}: Error finding container 6ea509c8e88f6178f84c59cd457f5966fc56b55dd66d04e6b6b71b9e43389266: Status 404 returned error can't find the container with id 6ea509c8e88f6178f84c59cd457f5966fc56b55dd66d04e6b6b71b9e43389266
Apr 16 14:10:59.887752 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:10:59.887711 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jvvfr" event={"ID":"4c9bd71c-1397-46a5-a6df-59c7835d9635","Type":"ContainerStarted","Data":"6ea509c8e88f6178f84c59cd457f5966fc56b55dd66d04e6b6b71b9e43389266"}
Apr 16 14:11:01.896562 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:01.896473 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jvvfr" event={"ID":"4c9bd71c-1397-46a5-a6df-59c7835d9635","Type":"ContainerStarted","Data":"118e1027ad03916326ee92cb5f43795ff6afdf3fc8a17a02dbcc70564ed85119"}
Apr 16 14:11:01.897060 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:01.896619 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jvvfr"
Apr 16 14:11:01.922375 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:01.922318 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jvvfr" podStartSLOduration=1.376922454 podStartE2EDuration="3.922303704s" podCreationTimestamp="2026-04-16 14:10:58 +0000 UTC" firstStartedPulling="2026-04-16 14:10:59.08447936 +0000 UTC m=+694.092715592" lastFinishedPulling="2026-04-16 14:11:01.629860594 +0000 UTC m=+696.638096842" observedRunningTime="2026-04-16 14:11:01.921683819 +0000 UTC m=+696.929920073" watchObservedRunningTime="2026-04-16 14:11:01.922303704 +0000 UTC m=+696.930539957" Apr 16 14:11:03.311889 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.311853 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5"] Apr 16 14:11:03.315292 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.315273 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5" Apr 16 14:11:03.318027 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.318003 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 16 14:11:03.318158 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.318051 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 16 14:11:03.318158 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.318087 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9bhj5\"" Apr 16 14:11:03.325789 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.325766 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5"] Apr 16 14:11:03.358636 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.358605 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mvbv\" (UniqueName: 
\"kubernetes.io/projected/94ae3b08-04b1-494c-80f6-02db41384aca-kube-api-access-5mvbv\") pod \"kuadrant-console-plugin-6c886788f8-ss6p5\" (UID: \"94ae3b08-04b1-494c-80f6-02db41384aca\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5" Apr 16 14:11:03.358636 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.358651 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/94ae3b08-04b1-494c-80f6-02db41384aca-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ss6p5\" (UID: \"94ae3b08-04b1-494c-80f6-02db41384aca\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5" Apr 16 14:11:03.358831 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.358749 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/94ae3b08-04b1-494c-80f6-02db41384aca-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-ss6p5\" (UID: \"94ae3b08-04b1-494c-80f6-02db41384aca\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5" Apr 16 14:11:03.459362 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.459327 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mvbv\" (UniqueName: \"kubernetes.io/projected/94ae3b08-04b1-494c-80f6-02db41384aca-kube-api-access-5mvbv\") pod \"kuadrant-console-plugin-6c886788f8-ss6p5\" (UID: \"94ae3b08-04b1-494c-80f6-02db41384aca\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5" Apr 16 14:11:03.459540 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.459378 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/94ae3b08-04b1-494c-80f6-02db41384aca-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ss6p5\" (UID: \"94ae3b08-04b1-494c-80f6-02db41384aca\") " 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5" Apr 16 14:11:03.459540 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.459435 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/94ae3b08-04b1-494c-80f6-02db41384aca-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-ss6p5\" (UID: \"94ae3b08-04b1-494c-80f6-02db41384aca\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5" Apr 16 14:11:03.460117 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.460098 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/94ae3b08-04b1-494c-80f6-02db41384aca-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-ss6p5\" (UID: \"94ae3b08-04b1-494c-80f6-02db41384aca\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5" Apr 16 14:11:03.461871 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.461854 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/94ae3b08-04b1-494c-80f6-02db41384aca-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ss6p5\" (UID: \"94ae3b08-04b1-494c-80f6-02db41384aca\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5" Apr 16 14:11:03.468678 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.468656 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mvbv\" (UniqueName: \"kubernetes.io/projected/94ae3b08-04b1-494c-80f6-02db41384aca-kube-api-access-5mvbv\") pod \"kuadrant-console-plugin-6c886788f8-ss6p5\" (UID: \"94ae3b08-04b1-494c-80f6-02db41384aca\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5" Apr 16 14:11:03.642297 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.642190 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5" Apr 16 14:11:03.766092 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.766052 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5"] Apr 16 14:11:03.770481 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:11:03.770446 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94ae3b08_04b1_494c_80f6_02db41384aca.slice/crio-fcf713300de981fdaf052398cb00f9add3188e5cfbdf46c62833ede7e22c07f4 WatchSource:0}: Error finding container fcf713300de981fdaf052398cb00f9add3188e5cfbdf46c62833ede7e22c07f4: Status 404 returned error can't find the container with id fcf713300de981fdaf052398cb00f9add3188e5cfbdf46c62833ede7e22c07f4 Apr 16 14:11:03.906009 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:03.905925 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5" event={"ID":"94ae3b08-04b1-494c-80f6-02db41384aca","Type":"ContainerStarted","Data":"fcf713300de981fdaf052398cb00f9add3188e5cfbdf46c62833ede7e22c07f4"} Apr 16 14:11:11.938005 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:11.937972 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5" event={"ID":"94ae3b08-04b1-494c-80f6-02db41384aca","Type":"ContainerStarted","Data":"269b02d32cec859a7ddc06525e487bdd0fba48dfc245af60414d933a11c35102"} Apr 16 14:11:11.955051 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:11.955001 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ss6p5" podStartSLOduration=1.7394132949999999 podStartE2EDuration="8.954984359s" podCreationTimestamp="2026-04-16 14:11:03 +0000 UTC" firstStartedPulling="2026-04-16 14:11:03.772071896 +0000 UTC m=+698.780308129" 
lastFinishedPulling="2026-04-16 14:11:10.987642958 +0000 UTC m=+705.995879193" observedRunningTime="2026-04-16 14:11:11.952919019 +0000 UTC m=+706.961155274" watchObservedRunningTime="2026-04-16 14:11:11.954984359 +0000 UTC m=+706.963220703" Apr 16 14:11:12.903465 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:12.903434 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jvvfr" Apr 16 14:11:45.858605 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:45.858573 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-jv82n"] Apr 16 14:11:45.884947 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:45.884914 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-jv82n"] Apr 16 14:11:45.885135 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:45.885053 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-jv82n" Apr 16 14:11:45.888042 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:45.888017 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-65tvt\"" Apr 16 14:11:46.041213 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:46.041174 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4kpn\" (UniqueName: \"kubernetes.io/projected/e13a13a9-cb25-4f56-9eda-a04ebfcd49c0-kube-api-access-t4kpn\") pod \"authorino-674b59b84c-jv82n\" (UID: \"e13a13a9-cb25-4f56-9eda-a04ebfcd49c0\") " pod="kuadrant-system/authorino-674b59b84c-jv82n" Apr 16 14:11:46.058134 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:46.058095 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-kj8m4"] Apr 16 14:11:46.061563 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:46.061544 2569 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-kj8m4" Apr 16 14:11:46.068830 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:46.068794 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-kj8m4"] Apr 16 14:11:46.141862 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:46.141771 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4kpn\" (UniqueName: \"kubernetes.io/projected/e13a13a9-cb25-4f56-9eda-a04ebfcd49c0-kube-api-access-t4kpn\") pod \"authorino-674b59b84c-jv82n\" (UID: \"e13a13a9-cb25-4f56-9eda-a04ebfcd49c0\") " pod="kuadrant-system/authorino-674b59b84c-jv82n" Apr 16 14:11:46.149958 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:46.149920 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4kpn\" (UniqueName: \"kubernetes.io/projected/e13a13a9-cb25-4f56-9eda-a04ebfcd49c0-kube-api-access-t4kpn\") pod \"authorino-674b59b84c-jv82n\" (UID: \"e13a13a9-cb25-4f56-9eda-a04ebfcd49c0\") " pod="kuadrant-system/authorino-674b59b84c-jv82n" Apr 16 14:11:46.194839 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:46.194802 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-jv82n" Apr 16 14:11:46.242942 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:46.242900 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v27dn\" (UniqueName: \"kubernetes.io/projected/c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e-kube-api-access-v27dn\") pod \"authorino-79cbc94b89-kj8m4\" (UID: \"c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e\") " pod="kuadrant-system/authorino-79cbc94b89-kj8m4" Apr 16 14:11:46.321711 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:46.321676 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-jv82n"] Apr 16 14:11:46.322880 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:11:46.322852 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode13a13a9_cb25_4f56_9eda_a04ebfcd49c0.slice/crio-294f19752e474538bf445bd53a187a5e3b50c1b0ebc49092951fc46477c624f6 WatchSource:0}: Error finding container 294f19752e474538bf445bd53a187a5e3b50c1b0ebc49092951fc46477c624f6: Status 404 returned error can't find the container with id 294f19752e474538bf445bd53a187a5e3b50c1b0ebc49092951fc46477c624f6 Apr 16 14:11:46.324155 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:46.324138 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:11:46.344144 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:46.344111 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v27dn\" (UniqueName: \"kubernetes.io/projected/c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e-kube-api-access-v27dn\") pod \"authorino-79cbc94b89-kj8m4\" (UID: \"c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e\") " pod="kuadrant-system/authorino-79cbc94b89-kj8m4" Apr 16 14:11:46.352111 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:46.352077 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-v27dn\" (UniqueName: \"kubernetes.io/projected/c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e-kube-api-access-v27dn\") pod \"authorino-79cbc94b89-kj8m4\" (UID: \"c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e\") " pod="kuadrant-system/authorino-79cbc94b89-kj8m4" Apr 16 14:11:46.372029 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:46.371992 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-kj8m4" Apr 16 14:11:46.495861 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:46.495834 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-kj8m4"] Apr 16 14:11:46.498180 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:11:46.498149 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6e400d4_d2c3_4e25_a5c0_bf1bb1605b9e.slice/crio-7b326d81596fd08d437ad2f4b824cdfa67653716c296cc0925298df040d74d3a WatchSource:0}: Error finding container 7b326d81596fd08d437ad2f4b824cdfa67653716c296cc0925298df040d74d3a: Status 404 returned error can't find the container with id 7b326d81596fd08d437ad2f4b824cdfa67653716c296cc0925298df040d74d3a Apr 16 14:11:47.071335 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:47.071289 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-kj8m4" event={"ID":"c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e","Type":"ContainerStarted","Data":"7b326d81596fd08d437ad2f4b824cdfa67653716c296cc0925298df040d74d3a"} Apr 16 14:11:47.072580 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:47.072553 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-jv82n" event={"ID":"e13a13a9-cb25-4f56-9eda-a04ebfcd49c0","Type":"ContainerStarted","Data":"294f19752e474538bf445bd53a187a5e3b50c1b0ebc49092951fc46477c624f6"} Apr 16 14:11:49.087648 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:49.087605 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-kj8m4" event={"ID":"c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e","Type":"ContainerStarted","Data":"01c6daf049ddc732e3ffc13d1ed849acf500e479ab3e5614d0840fe06d3bb6d7"} Apr 16 14:11:49.089265 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:49.089232 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-jv82n" event={"ID":"e13a13a9-cb25-4f56-9eda-a04ebfcd49c0","Type":"ContainerStarted","Data":"3a1c118ceabb7ae6b5100ae40b8ec61dac3ff157a8f8e5b22032ad5ebd5daa2b"} Apr 16 14:11:49.104313 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:49.104235 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-kj8m4" podStartSLOduration=0.613625756 podStartE2EDuration="3.104217349s" podCreationTimestamp="2026-04-16 14:11:46 +0000 UTC" firstStartedPulling="2026-04-16 14:11:46.499556157 +0000 UTC m=+741.507792390" lastFinishedPulling="2026-04-16 14:11:48.990147733 +0000 UTC m=+743.998383983" observedRunningTime="2026-04-16 14:11:49.10201014 +0000 UTC m=+744.110246396" watchObservedRunningTime="2026-04-16 14:11:49.104217349 +0000 UTC m=+744.112453602" Apr 16 14:11:49.120629 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:49.119141 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-jv82n" podStartSLOduration=1.5199084219999999 podStartE2EDuration="4.119121142s" podCreationTimestamp="2026-04-16 14:11:45 +0000 UTC" firstStartedPulling="2026-04-16 14:11:46.324283456 +0000 UTC m=+741.332519689" lastFinishedPulling="2026-04-16 14:11:48.923496176 +0000 UTC m=+743.931732409" observedRunningTime="2026-04-16 14:11:49.118749653 +0000 UTC m=+744.126985909" watchObservedRunningTime="2026-04-16 14:11:49.119121142 +0000 UTC m=+744.127357398" Apr 16 14:11:49.155174 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:49.155112 2569 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-jv82n"] Apr 16 14:11:51.096067 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:51.096028 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-jv82n" podUID="e13a13a9-cb25-4f56-9eda-a04ebfcd49c0" containerName="authorino" containerID="cri-o://3a1c118ceabb7ae6b5100ae40b8ec61dac3ff157a8f8e5b22032ad5ebd5daa2b" gracePeriod=30 Apr 16 14:11:51.336199 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:51.336174 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-jv82n" Apr 16 14:11:51.394865 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:51.394771 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4kpn\" (UniqueName: \"kubernetes.io/projected/e13a13a9-cb25-4f56-9eda-a04ebfcd49c0-kube-api-access-t4kpn\") pod \"e13a13a9-cb25-4f56-9eda-a04ebfcd49c0\" (UID: \"e13a13a9-cb25-4f56-9eda-a04ebfcd49c0\") " Apr 16 14:11:51.396965 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:51.396932 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e13a13a9-cb25-4f56-9eda-a04ebfcd49c0-kube-api-access-t4kpn" (OuterVolumeSpecName: "kube-api-access-t4kpn") pod "e13a13a9-cb25-4f56-9eda-a04ebfcd49c0" (UID: "e13a13a9-cb25-4f56-9eda-a04ebfcd49c0"). InnerVolumeSpecName "kube-api-access-t4kpn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:11:51.495653 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:51.495616 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t4kpn\" (UniqueName: \"kubernetes.io/projected/e13a13a9-cb25-4f56-9eda-a04ebfcd49c0-kube-api-access-t4kpn\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:11:52.100331 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:52.100293 2569 generic.go:358] "Generic (PLEG): container finished" podID="e13a13a9-cb25-4f56-9eda-a04ebfcd49c0" containerID="3a1c118ceabb7ae6b5100ae40b8ec61dac3ff157a8f8e5b22032ad5ebd5daa2b" exitCode=0 Apr 16 14:11:52.100756 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:52.100343 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-jv82n" Apr 16 14:11:52.100756 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:52.100379 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-jv82n" event={"ID":"e13a13a9-cb25-4f56-9eda-a04ebfcd49c0","Type":"ContainerDied","Data":"3a1c118ceabb7ae6b5100ae40b8ec61dac3ff157a8f8e5b22032ad5ebd5daa2b"} Apr 16 14:11:52.100756 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:52.100416 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-jv82n" event={"ID":"e13a13a9-cb25-4f56-9eda-a04ebfcd49c0","Type":"ContainerDied","Data":"294f19752e474538bf445bd53a187a5e3b50c1b0ebc49092951fc46477c624f6"} Apr 16 14:11:52.100756 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:52.100435 2569 scope.go:117] "RemoveContainer" containerID="3a1c118ceabb7ae6b5100ae40b8ec61dac3ff157a8f8e5b22032ad5ebd5daa2b" Apr 16 14:11:52.109050 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:52.109031 2569 scope.go:117] "RemoveContainer" containerID="3a1c118ceabb7ae6b5100ae40b8ec61dac3ff157a8f8e5b22032ad5ebd5daa2b" Apr 16 14:11:52.109396 ip-10-0-128-60 kubenswrapper[2569]: E0416 
14:11:52.109367 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a1c118ceabb7ae6b5100ae40b8ec61dac3ff157a8f8e5b22032ad5ebd5daa2b\": container with ID starting with 3a1c118ceabb7ae6b5100ae40b8ec61dac3ff157a8f8e5b22032ad5ebd5daa2b not found: ID does not exist" containerID="3a1c118ceabb7ae6b5100ae40b8ec61dac3ff157a8f8e5b22032ad5ebd5daa2b" Apr 16 14:11:52.109493 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:52.109404 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1c118ceabb7ae6b5100ae40b8ec61dac3ff157a8f8e5b22032ad5ebd5daa2b"} err="failed to get container status \"3a1c118ceabb7ae6b5100ae40b8ec61dac3ff157a8f8e5b22032ad5ebd5daa2b\": rpc error: code = NotFound desc = could not find container \"3a1c118ceabb7ae6b5100ae40b8ec61dac3ff157a8f8e5b22032ad5ebd5daa2b\": container with ID starting with 3a1c118ceabb7ae6b5100ae40b8ec61dac3ff157a8f8e5b22032ad5ebd5daa2b not found: ID does not exist" Apr 16 14:11:52.119819 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:52.119791 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-jv82n"] Apr 16 14:11:52.123736 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:52.123712 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-jv82n"] Apr 16 14:11:53.603437 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:11:53.603396 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e13a13a9-cb25-4f56-9eda-a04ebfcd49c0" path="/var/lib/kubelet/pods/e13a13a9-cb25-4f56-9eda-a04ebfcd49c0/volumes" Apr 16 14:12:10.492676 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:10.492637 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-6wqhq"] Apr 16 14:12:10.494071 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:10.494042 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="e13a13a9-cb25-4f56-9eda-a04ebfcd49c0" containerName="authorino" Apr 16 14:12:10.494228 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:10.494217 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13a13a9-cb25-4f56-9eda-a04ebfcd49c0" containerName="authorino" Apr 16 14:12:10.494527 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:10.494512 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e13a13a9-cb25-4f56-9eda-a04ebfcd49c0" containerName="authorino" Apr 16 14:12:10.499476 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:10.499448 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-6wqhq" Apr 16 14:12:10.501984 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:10.501957 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-6wqhq"] Apr 16 14:12:10.502094 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:10.502052 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 14:12:10.552123 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:10.552091 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgdtm\" (UniqueName: \"kubernetes.io/projected/589a9481-338e-4fcb-9b1a-272930da4805-kube-api-access-xgdtm\") pod \"authorino-68bd676465-6wqhq\" (UID: \"589a9481-338e-4fcb-9b1a-272930da4805\") " pod="kuadrant-system/authorino-68bd676465-6wqhq" Apr 16 14:12:10.552299 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:10.552160 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/589a9481-338e-4fcb-9b1a-272930da4805-tls-cert\") pod \"authorino-68bd676465-6wqhq\" (UID: \"589a9481-338e-4fcb-9b1a-272930da4805\") " pod="kuadrant-system/authorino-68bd676465-6wqhq" Apr 16 14:12:10.653522 ip-10-0-128-60 
kubenswrapper[2569]: I0416 14:12:10.653485 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgdtm\" (UniqueName: \"kubernetes.io/projected/589a9481-338e-4fcb-9b1a-272930da4805-kube-api-access-xgdtm\") pod \"authorino-68bd676465-6wqhq\" (UID: \"589a9481-338e-4fcb-9b1a-272930da4805\") " pod="kuadrant-system/authorino-68bd676465-6wqhq" Apr 16 14:12:10.653722 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:10.653648 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/589a9481-338e-4fcb-9b1a-272930da4805-tls-cert\") pod \"authorino-68bd676465-6wqhq\" (UID: \"589a9481-338e-4fcb-9b1a-272930da4805\") " pod="kuadrant-system/authorino-68bd676465-6wqhq" Apr 16 14:12:10.656047 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:10.656021 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/589a9481-338e-4fcb-9b1a-272930da4805-tls-cert\") pod \"authorino-68bd676465-6wqhq\" (UID: \"589a9481-338e-4fcb-9b1a-272930da4805\") " pod="kuadrant-system/authorino-68bd676465-6wqhq" Apr 16 14:12:10.661672 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:10.661650 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgdtm\" (UniqueName: \"kubernetes.io/projected/589a9481-338e-4fcb-9b1a-272930da4805-kube-api-access-xgdtm\") pod \"authorino-68bd676465-6wqhq\" (UID: \"589a9481-338e-4fcb-9b1a-272930da4805\") " pod="kuadrant-system/authorino-68bd676465-6wqhq" Apr 16 14:12:10.809218 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:10.809134 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-6wqhq"
Apr 16 14:12:10.934620 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:10.934588 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-6wqhq"]
Apr 16 14:12:10.937641 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:12:10.937615 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod589a9481_338e_4fcb_9b1a_272930da4805.slice/crio-4dd28afa4c6b066c11fa698cf40ef8fcb539c10ead6201d8d24d61cc78422eda WatchSource:0}: Error finding container 4dd28afa4c6b066c11fa698cf40ef8fcb539c10ead6201d8d24d61cc78422eda: Status 404 returned error can't find the container with id 4dd28afa4c6b066c11fa698cf40ef8fcb539c10ead6201d8d24d61cc78422eda
Apr 16 14:12:11.172026 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:11.171994 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-6wqhq" event={"ID":"589a9481-338e-4fcb-9b1a-272930da4805","Type":"ContainerStarted","Data":"4dd28afa4c6b066c11fa698cf40ef8fcb539c10ead6201d8d24d61cc78422eda"}
Apr 16 14:12:12.177359 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:12.177326 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-6wqhq" event={"ID":"589a9481-338e-4fcb-9b1a-272930da4805","Type":"ContainerStarted","Data":"d27660a138748a6bbbf2ddeee1ca638cb16f95ba630001a30f694ba89666dbd4"}
Apr 16 14:12:12.193949 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:12.193900 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-6wqhq" podStartSLOduration=1.6134544069999999 podStartE2EDuration="2.193887579s" podCreationTimestamp="2026-04-16 14:12:10 +0000 UTC" firstStartedPulling="2026-04-16 14:12:10.938986181 +0000 UTC m=+765.947222414" lastFinishedPulling="2026-04-16 14:12:11.519419339 +0000 UTC m=+766.527655586" observedRunningTime="2026-04-16 14:12:12.192227912 +0000 UTC m=+767.200464167" watchObservedRunningTime="2026-04-16 14:12:12.193887579 +0000 UTC m=+767.202123833"
Apr 16 14:12:12.218096 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:12.218063 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-kj8m4"]
Apr 16 14:12:12.218284 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:12.218246 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-kj8m4" podUID="c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e" containerName="authorino" containerID="cri-o://01c6daf049ddc732e3ffc13d1ed849acf500e479ab3e5614d0840fe06d3bb6d7" gracePeriod=30
Apr 16 14:12:12.463182 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:12.463154 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-kj8m4"
Apr 16 14:12:12.569858 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:12.569820 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v27dn\" (UniqueName: \"kubernetes.io/projected/c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e-kube-api-access-v27dn\") pod \"c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e\" (UID: \"c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e\") "
Apr 16 14:12:12.571918 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:12.571888 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e-kube-api-access-v27dn" (OuterVolumeSpecName: "kube-api-access-v27dn") pod "c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e" (UID: "c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e"). InnerVolumeSpecName "kube-api-access-v27dn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:12:12.670520 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:12.670485 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v27dn\" (UniqueName: \"kubernetes.io/projected/c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e-kube-api-access-v27dn\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:12:13.182393 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:13.182354 2569 generic.go:358] "Generic (PLEG): container finished" podID="c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e" containerID="01c6daf049ddc732e3ffc13d1ed849acf500e479ab3e5614d0840fe06d3bb6d7" exitCode=0
Apr 16 14:12:13.182823 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:13.182411 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-kj8m4"
Apr 16 14:12:13.182823 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:13.182435 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-kj8m4" event={"ID":"c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e","Type":"ContainerDied","Data":"01c6daf049ddc732e3ffc13d1ed849acf500e479ab3e5614d0840fe06d3bb6d7"}
Apr 16 14:12:13.182823 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:13.182469 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-kj8m4" event={"ID":"c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e","Type":"ContainerDied","Data":"7b326d81596fd08d437ad2f4b824cdfa67653716c296cc0925298df040d74d3a"}
Apr 16 14:12:13.182823 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:13.182484 2569 scope.go:117] "RemoveContainer" containerID="01c6daf049ddc732e3ffc13d1ed849acf500e479ab3e5614d0840fe06d3bb6d7"
Apr 16 14:12:13.191717 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:13.191336 2569 scope.go:117] "RemoveContainer" containerID="01c6daf049ddc732e3ffc13d1ed849acf500e479ab3e5614d0840fe06d3bb6d7"
Apr 16 14:12:13.191717 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:12:13.191627 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01c6daf049ddc732e3ffc13d1ed849acf500e479ab3e5614d0840fe06d3bb6d7\": container with ID starting with 01c6daf049ddc732e3ffc13d1ed849acf500e479ab3e5614d0840fe06d3bb6d7 not found: ID does not exist" containerID="01c6daf049ddc732e3ffc13d1ed849acf500e479ab3e5614d0840fe06d3bb6d7"
Apr 16 14:12:13.191717 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:13.191659 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01c6daf049ddc732e3ffc13d1ed849acf500e479ab3e5614d0840fe06d3bb6d7"} err="failed to get container status \"01c6daf049ddc732e3ffc13d1ed849acf500e479ab3e5614d0840fe06d3bb6d7\": rpc error: code = NotFound desc = could not find container \"01c6daf049ddc732e3ffc13d1ed849acf500e479ab3e5614d0840fe06d3bb6d7\": container with ID starting with 01c6daf049ddc732e3ffc13d1ed849acf500e479ab3e5614d0840fe06d3bb6d7 not found: ID does not exist"
Apr 16 14:12:13.211873 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:13.211844 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-kj8m4"]
Apr 16 14:12:13.214975 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:13.214950 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-kj8m4"]
Apr 16 14:12:13.603510 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:13.603434 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e" path="/var/lib/kubelet/pods/c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e/volumes"
Apr 16 14:12:29.538402 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.538367 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-82k5c"]
Apr 16 14:12:29.538898 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.538877 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e" containerName="authorino"
Apr 16 14:12:29.538971 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.538901 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e" containerName="authorino"
Apr 16 14:12:29.539029 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.538975 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6e400d4-d2c3-4e25-a5c0-bf1bb1605b9e" containerName="authorino"
Apr 16 14:12:29.546506 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.546481 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-82k5c"
Apr 16 14:12:29.549328 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.549303 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 14:12:29.549465 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.549303 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-58mj8\""
Apr 16 14:12:29.549971 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.549935 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 14:12:29.549971 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.549942 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 14:12:29.551593 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.551569 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-82k5c"]
Apr 16 14:12:29.623072 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.623027 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnlsj\" (UniqueName: \"kubernetes.io/projected/fb082a84-67c7-4c81-84cd-a32737d79ddc-kube-api-access-fnlsj\") pod \"seaweedfs-86cc847c5c-82k5c\" (UID: \"fb082a84-67c7-4c81-84cd-a32737d79ddc\") " pod="kserve/seaweedfs-86cc847c5c-82k5c"
Apr 16 14:12:29.623279 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.623123 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fb082a84-67c7-4c81-84cd-a32737d79ddc-data\") pod \"seaweedfs-86cc847c5c-82k5c\" (UID: \"fb082a84-67c7-4c81-84cd-a32737d79ddc\") " pod="kserve/seaweedfs-86cc847c5c-82k5c"
Apr 16 14:12:29.724627 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.724579 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fb082a84-67c7-4c81-84cd-a32737d79ddc-data\") pod \"seaweedfs-86cc847c5c-82k5c\" (UID: \"fb082a84-67c7-4c81-84cd-a32737d79ddc\") " pod="kserve/seaweedfs-86cc847c5c-82k5c"
Apr 16 14:12:29.724830 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.724662 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnlsj\" (UniqueName: \"kubernetes.io/projected/fb082a84-67c7-4c81-84cd-a32737d79ddc-kube-api-access-fnlsj\") pod \"seaweedfs-86cc847c5c-82k5c\" (UID: \"fb082a84-67c7-4c81-84cd-a32737d79ddc\") " pod="kserve/seaweedfs-86cc847c5c-82k5c"
Apr 16 14:12:29.724971 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.724948 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fb082a84-67c7-4c81-84cd-a32737d79ddc-data\") pod \"seaweedfs-86cc847c5c-82k5c\" (UID: \"fb082a84-67c7-4c81-84cd-a32737d79ddc\") " pod="kserve/seaweedfs-86cc847c5c-82k5c"
Apr 16 14:12:29.734169 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.734132 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnlsj\" (UniqueName: \"kubernetes.io/projected/fb082a84-67c7-4c81-84cd-a32737d79ddc-kube-api-access-fnlsj\") pod \"seaweedfs-86cc847c5c-82k5c\" (UID: \"fb082a84-67c7-4c81-84cd-a32737d79ddc\") " pod="kserve/seaweedfs-86cc847c5c-82k5c"
Apr 16 14:12:29.857929 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.857826 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-82k5c"
Apr 16 14:12:29.990956 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:29.990929 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-82k5c"]
Apr 16 14:12:29.992824 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:12:29.992794 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb082a84_67c7_4c81_84cd_a32737d79ddc.slice/crio-8ca07ac90e10376bedec5df76a9544099d170d9dd50c1d5e82ae7bcf25806c2d WatchSource:0}: Error finding container 8ca07ac90e10376bedec5df76a9544099d170d9dd50c1d5e82ae7bcf25806c2d: Status 404 returned error can't find the container with id 8ca07ac90e10376bedec5df76a9544099d170d9dd50c1d5e82ae7bcf25806c2d
Apr 16 14:12:30.249140 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:30.249101 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-82k5c" event={"ID":"fb082a84-67c7-4c81-84cd-a32737d79ddc","Type":"ContainerStarted","Data":"8ca07ac90e10376bedec5df76a9544099d170d9dd50c1d5e82ae7bcf25806c2d"}
Apr 16 14:12:33.262505 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:33.262467 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-82k5c" event={"ID":"fb082a84-67c7-4c81-84cd-a32737d79ddc","Type":"ContainerStarted","Data":"4619eed44fac61c260356cb8c5841b4f679925c4b335ae8e4fa5647d75100271"}
Apr 16 14:12:33.262965 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:33.262601 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-82k5c"
Apr 16 14:12:33.279352 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:33.279299 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-82k5c" podStartSLOduration=1.658159012 podStartE2EDuration="4.279285658s" podCreationTimestamp="2026-04-16 14:12:29 +0000 UTC" firstStartedPulling="2026-04-16 14:12:29.994243015 +0000 UTC m=+785.002479252" lastFinishedPulling="2026-04-16 14:12:32.615369665 +0000 UTC m=+787.623605898" observedRunningTime="2026-04-16 14:12:33.277272725 +0000 UTC m=+788.285508977" watchObservedRunningTime="2026-04-16 14:12:33.279285658 +0000 UTC m=+788.287521931"
Apr 16 14:12:39.268333 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:12:39.268299 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-82k5c"
Apr 16 14:13:57.648468 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:13:57.648432 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-pslh8"]
Apr 16 14:13:57.651654 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:13:57.651636 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-pslh8"
Apr 16 14:13:57.669006 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:13:57.668982 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-pslh8"]
Apr 16 14:13:57.676623 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:13:57.676595 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wln8d\" (UniqueName: \"kubernetes.io/projected/fe2d0489-0a39-4cfb-bc54-091169ee40ff-kube-api-access-wln8d\") pod \"s3-init-pslh8\" (UID: \"fe2d0489-0a39-4cfb-bc54-091169ee40ff\") " pod="kserve/s3-init-pslh8"
Apr 16 14:13:57.777224 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:13:57.777181 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wln8d\" (UniqueName: \"kubernetes.io/projected/fe2d0489-0a39-4cfb-bc54-091169ee40ff-kube-api-access-wln8d\") pod \"s3-init-pslh8\" (UID: \"fe2d0489-0a39-4cfb-bc54-091169ee40ff\") " pod="kserve/s3-init-pslh8"
Apr 16 14:13:57.787654 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:13:57.787623 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wln8d\" (UniqueName: \"kubernetes.io/projected/fe2d0489-0a39-4cfb-bc54-091169ee40ff-kube-api-access-wln8d\") pod \"s3-init-pslh8\" (UID: \"fe2d0489-0a39-4cfb-bc54-091169ee40ff\") " pod="kserve/s3-init-pslh8"
Apr 16 14:13:57.960160 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:13:57.960120 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-pslh8"
Apr 16 14:13:58.085105 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:13:58.085071 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-pslh8"]
Apr 16 14:13:58.086767 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:13:58.086734 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe2d0489_0a39_4cfb_bc54_091169ee40ff.slice/crio-2b19d86a3867e6905f280b7302d9e96e9b44d0200972b56f6d2bd72300d48f45 WatchSource:0}: Error finding container 2b19d86a3867e6905f280b7302d9e96e9b44d0200972b56f6d2bd72300d48f45: Status 404 returned error can't find the container with id 2b19d86a3867e6905f280b7302d9e96e9b44d0200972b56f6d2bd72300d48f45
Apr 16 14:13:58.588378 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:13:58.588312 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-pslh8" event={"ID":"fe2d0489-0a39-4cfb-bc54-091169ee40ff","Type":"ContainerStarted","Data":"2b19d86a3867e6905f280b7302d9e96e9b44d0200972b56f6d2bd72300d48f45"}
Apr 16 14:14:02.606347 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:02.606306 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-pslh8" event={"ID":"fe2d0489-0a39-4cfb-bc54-091169ee40ff","Type":"ContainerStarted","Data":"5904cabe99b30d1a7f2caf748a103db317578a104717893a591188baf6b856d8"}
Apr 16 14:14:02.624378 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:02.624327 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-pslh8" podStartSLOduration=1.1885798730000001 podStartE2EDuration="5.624313807s" podCreationTimestamp="2026-04-16 14:13:57 +0000 UTC" firstStartedPulling="2026-04-16 14:13:58.088475979 +0000 UTC m=+873.096712212" lastFinishedPulling="2026-04-16 14:14:02.524209899 +0000 UTC m=+877.532446146" observedRunningTime="2026-04-16 14:14:02.621677403 +0000 UTC m=+877.629913658" watchObservedRunningTime="2026-04-16 14:14:02.624313807 +0000 UTC m=+877.632550062"
Apr 16 14:14:06.622867 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:06.622826 2569 generic.go:358] "Generic (PLEG): container finished" podID="fe2d0489-0a39-4cfb-bc54-091169ee40ff" containerID="5904cabe99b30d1a7f2caf748a103db317578a104717893a591188baf6b856d8" exitCode=0
Apr 16 14:14:06.622867 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:06.622870 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-pslh8" event={"ID":"fe2d0489-0a39-4cfb-bc54-091169ee40ff","Type":"ContainerDied","Data":"5904cabe99b30d1a7f2caf748a103db317578a104717893a591188baf6b856d8"}
Apr 16 14:14:07.760189 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:07.760160 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-pslh8"
Apr 16 14:14:07.873647 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:07.873603 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wln8d\" (UniqueName: \"kubernetes.io/projected/fe2d0489-0a39-4cfb-bc54-091169ee40ff-kube-api-access-wln8d\") pod \"fe2d0489-0a39-4cfb-bc54-091169ee40ff\" (UID: \"fe2d0489-0a39-4cfb-bc54-091169ee40ff\") "
Apr 16 14:14:07.875853 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:07.875825 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe2d0489-0a39-4cfb-bc54-091169ee40ff-kube-api-access-wln8d" (OuterVolumeSpecName: "kube-api-access-wln8d") pod "fe2d0489-0a39-4cfb-bc54-091169ee40ff" (UID: "fe2d0489-0a39-4cfb-bc54-091169ee40ff"). InnerVolumeSpecName "kube-api-access-wln8d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:14:07.974638 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:07.974599 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wln8d\" (UniqueName: \"kubernetes.io/projected/fe2d0489-0a39-4cfb-bc54-091169ee40ff-kube-api-access-wln8d\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:14:08.631583 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:08.631550 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-pslh8"
Apr 16 14:14:08.631762 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:08.631582 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-pslh8" event={"ID":"fe2d0489-0a39-4cfb-bc54-091169ee40ff","Type":"ContainerDied","Data":"2b19d86a3867e6905f280b7302d9e96e9b44d0200972b56f6d2bd72300d48f45"}
Apr 16 14:14:08.631762 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:08.631616 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b19d86a3867e6905f280b7302d9e96e9b44d0200972b56f6d2bd72300d48f45"
Apr 16 14:14:19.049972 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.049891 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"]
Apr 16 14:14:19.050345 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.050279 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe2d0489-0a39-4cfb-bc54-091169ee40ff" containerName="s3-init"
Apr 16 14:14:19.050345 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.050291 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe2d0489-0a39-4cfb-bc54-091169ee40ff" containerName="s3-init"
Apr 16 14:14:19.050412 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.050383 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe2d0489-0a39-4cfb-bc54-091169ee40ff" containerName="s3-init"
Apr 16 14:14:19.053261 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.053235 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.055970 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.055943 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-tjn59\""
Apr 16 14:14:19.056236 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.056213 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\""
Apr 16 14:14:19.056236 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.056213 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 14:14:19.056490 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.056296 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 14:14:19.075025 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.074996 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"]
Apr 16 14:14:19.172228 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.172187 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9rvx\" (UniqueName: \"kubernetes.io/projected/e90a9704-cfe7-474f-aaaf-2b098a4430fe-kube-api-access-f9rvx\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.172446 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.172341 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e90a9704-cfe7-474f-aaaf-2b098a4430fe-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.172446 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.172382 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e90a9704-cfe7-474f-aaaf-2b098a4430fe-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.172446 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.172415 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e90a9704-cfe7-474f-aaaf-2b098a4430fe-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.172446 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.172444 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e90a9704-cfe7-474f-aaaf-2b098a4430fe-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.172621 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.172477 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e90a9704-cfe7-474f-aaaf-2b098a4430fe-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.172621 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.172539 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e90a9704-cfe7-474f-aaaf-2b098a4430fe-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.172717 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.172631 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e90a9704-cfe7-474f-aaaf-2b098a4430fe-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.172717 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.172651 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e90a9704-cfe7-474f-aaaf-2b098a4430fe-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.273097 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.273059 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e90a9704-cfe7-474f-aaaf-2b098a4430fe-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.273097 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.273099 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e90a9704-cfe7-474f-aaaf-2b098a4430fe-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.273347 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.273182 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9rvx\" (UniqueName: \"kubernetes.io/projected/e90a9704-cfe7-474f-aaaf-2b098a4430fe-kube-api-access-f9rvx\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.273347 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.273242 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e90a9704-cfe7-474f-aaaf-2b098a4430fe-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.273347 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.273297 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e90a9704-cfe7-474f-aaaf-2b098a4430fe-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.273347 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.273330 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e90a9704-cfe7-474f-aaaf-2b098a4430fe-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.273546 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.273360 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e90a9704-cfe7-474f-aaaf-2b098a4430fe-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.273546 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.273396 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e90a9704-cfe7-474f-aaaf-2b098a4430fe-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.273546 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.273427 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e90a9704-cfe7-474f-aaaf-2b098a4430fe-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.274053 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.273725 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e90a9704-cfe7-474f-aaaf-2b098a4430fe-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.274053 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.273747 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e90a9704-cfe7-474f-aaaf-2b098a4430fe-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.274053 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.273862 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e90a9704-cfe7-474f-aaaf-2b098a4430fe-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.274238 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.274081 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e90a9704-cfe7-474f-aaaf-2b098a4430fe-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.274238 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.274148 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e90a9704-cfe7-474f-aaaf-2b098a4430fe-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.275682 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.275664 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e90a9704-cfe7-474f-aaaf-2b098a4430fe-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.275856 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.275836 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e90a9704-cfe7-474f-aaaf-2b098a4430fe-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.282837 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.282805 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e90a9704-cfe7-474f-aaaf-2b098a4430fe-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.283014 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.282993 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9rvx\" (UniqueName: \"kubernetes.io/projected/e90a9704-cfe7-474f-aaaf-2b098a4430fe-kube-api-access-f9rvx\") pod \"router-gateway-1-openshift-default-6c59fbf55c-zkrpf\" (UID: \"e90a9704-cfe7-474f-aaaf-2b098a4430fe\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.366853 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.366766 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:19.495806 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.495777 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"]
Apr 16 14:14:19.497562 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:14:19.497530 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode90a9704_cfe7_474f_aaaf_2b098a4430fe.slice/crio-a1f19d2666b26c125635878be81161f74ccf005c1bb7b02a32fce93ac2d5d936 WatchSource:0}: Error finding container a1f19d2666b26c125635878be81161f74ccf005c1bb7b02a32fce93ac2d5d936: Status 404 returned error can't find the container with id a1f19d2666b26c125635878be81161f74ccf005c1bb7b02a32fce93ac2d5d936
Apr 16 14:14:19.671245 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:19.671206 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf" event={"ID":"e90a9704-cfe7-474f-aaaf-2b098a4430fe","Type":"ContainerStarted","Data":"a1f19d2666b26c125635878be81161f74ccf005c1bb7b02a32fce93ac2d5d936"}
Apr 16 14:14:21.893623 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:21.893587 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 14:14:21.893886 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:21.893664 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 14:14:21.893886 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:21.893691 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 14:14:22.686210 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:22.686177 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf" event={"ID":"e90a9704-cfe7-474f-aaaf-2b098a4430fe","Type":"ContainerStarted","Data":"2af655655efa20afb93715eb94f831af01bf89d236ff361066cad827f73d2b15"}
Apr 16 14:14:22.708504 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:22.708452 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf" podStartSLOduration=1.314479016 podStartE2EDuration="3.708438483s" podCreationTimestamp="2026-04-16 14:14:19 +0000 UTC" firstStartedPulling="2026-04-16 14:14:19.49940919 +0000 UTC m=+894.507645423" lastFinishedPulling="2026-04-16 14:14:21.893368645 +0000 UTC m=+896.901604890" observedRunningTime="2026-04-16 14:14:22.706648653 +0000 UTC m=+897.714884947" watchObservedRunningTime="2026-04-16 14:14:22.708438483 +0000 UTC m=+897.716674738"
Apr 16 14:14:23.367834 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:23.367795 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:23.373058 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:23.373031 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:23.690067 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:23.690033 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:23.691283 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:23.691241 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-zkrpf"
Apr 16 14:14:25.539903 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:25.539873 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/2.log"
Apr 16 14:14:25.540361 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:25.540113 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/2.log"
Apr 16 14:14:48.338539 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.338504 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8"]
Apr 16 14:14:48.372272 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.372222 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8"]
Apr 16 14:14:48.372437 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.372359 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.375536 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.375508 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xs8cb\"" Apr 16 14:14:48.376479 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.376458 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 16 14:14:48.429998 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.429942 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.429998 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.430007 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm5vk\" (UniqueName: \"kubernetes.io/projected/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-kube-api-access-cm5vk\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.430239 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.430029 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.430239 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.430067 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-home\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.430239 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.430090 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-dshm\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.430239 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.430150 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-model-cache\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.531542 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.531506 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-dshm\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.531741 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.531549 
2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-model-cache\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.531741 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.531588 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.531741 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.531703 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cm5vk\" (UniqueName: \"kubernetes.io/projected/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-kube-api-access-cm5vk\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.531918 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.531742 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.531918 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.531821 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-home\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.532023 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.531959 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.532023 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.532012 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-model-cache\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.532147 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.532128 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-home\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.533789 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.533764 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-dshm\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.534055 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.534036 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.540868 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.540848 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm5vk\" (UniqueName: \"kubernetes.io/projected/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-kube-api-access-cm5vk\") pod \"scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.683478 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.683440 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:14:48.815502 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:48.815474 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8"] Apr 16 14:14:48.817047 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:14:48.817018 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d4e3e8d_d506_4981_baaa_5cf335b9fc2c.slice/crio-480ba3f09542ff6f1e7ebd943690f1130edd509a76a9ac69aa75a11a80feebeb WatchSource:0}: Error finding container 480ba3f09542ff6f1e7ebd943690f1130edd509a76a9ac69aa75a11a80feebeb: Status 404 returned error can't find the container with id 480ba3f09542ff6f1e7ebd943690f1130edd509a76a9ac69aa75a11a80feebeb Apr 16 14:14:49.795082 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:49.795042 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" event={"ID":"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c","Type":"ContainerStarted","Data":"480ba3f09542ff6f1e7ebd943690f1130edd509a76a9ac69aa75a11a80feebeb"} Apr 16 14:14:53.814724 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:53.814684 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" event={"ID":"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c","Type":"ContainerStarted","Data":"f919846707a90e13650f599e7835a04d3048847de3fd8b994e77bd9e10f414a4"} Apr 16 14:14:57.831799 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:57.831766 2569 generic.go:358] "Generic (PLEG): container finished" podID="9d4e3e8d-d506-4981-baaa-5cf335b9fc2c" containerID="f919846707a90e13650f599e7835a04d3048847de3fd8b994e77bd9e10f414a4" exitCode=0 Apr 16 14:14:57.832177 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:57.831843 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" event={"ID":"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c","Type":"ContainerDied","Data":"f919846707a90e13650f599e7835a04d3048847de3fd8b994e77bd9e10f414a4"} Apr 16 14:14:59.841546 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:59.841510 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" event={"ID":"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c","Type":"ContainerStarted","Data":"2bd198fd63539d6d3cd5a86c78d7d45984565e9692e5e04c6435ee4d05e4974a"} Apr 16 14:14:59.862917 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:14:59.862863 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" podStartSLOduration=1.563725856 podStartE2EDuration="11.862847243s" podCreationTimestamp="2026-04-16 14:14:48 +0000 UTC" firstStartedPulling="2026-04-16 14:14:48.819347618 +0000 UTC m=+923.827583856" lastFinishedPulling="2026-04-16 14:14:59.118468998 +0000 UTC m=+934.126705243" observedRunningTime="2026-04-16 14:14:59.85937059 +0000 UTC m=+934.867606838" watchObservedRunningTime="2026-04-16 14:14:59.862847243 +0000 UTC m=+934.871083498" Apr 16 14:15:07.438658 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.438624 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c"] Apr 16 14:15:07.481439 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.481389 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c"] Apr 16 14:15:07.481616 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.481480 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.484324 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.484305 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 16 14:15:07.610204 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.610166 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.610419 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.610278 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.610419 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.610335 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.610533 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.610471 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.610533 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.610518 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c1c83d78-d385-47ec-b767-faa11cfd758b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.610616 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.610587 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbc6h\" (UniqueName: \"kubernetes.io/projected/c1c83d78-d385-47ec-b767-faa11cfd758b-kube-api-access-zbc6h\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.711319 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.711221 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbc6h\" (UniqueName: \"kubernetes.io/projected/c1c83d78-d385-47ec-b767-faa11cfd758b-kube-api-access-zbc6h\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.711490 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.711323 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.711490 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.711371 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.711603 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.711523 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.711603 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.711583 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.711719 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.711613 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c1c83d78-d385-47ec-b767-faa11cfd758b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.711823 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.711801 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.711934 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.711911 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.712029 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.712008 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.713620 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.713591 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.714062 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.714039 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c1c83d78-d385-47ec-b767-faa11cfd758b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.721291 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.721264 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbc6h\" (UniqueName: \"kubernetes.io/projected/c1c83d78-d385-47ec-b767-faa11cfd758b-kube-api-access-zbc6h\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.792123 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.792080 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:07.935911 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:07.935876 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c"] Apr 16 14:15:07.937922 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:15:07.937896 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1c83d78_d385_47ec_b767_faa11cfd758b.slice/crio-c4c739734f4d83dc8d233622a7f4545c09b6b92357eff9aa8ec6c5c7f1cdca3d WatchSource:0}: Error finding container c4c739734f4d83dc8d233622a7f4545c09b6b92357eff9aa8ec6c5c7f1cdca3d: Status 404 returned error can't find the container with id c4c739734f4d83dc8d233622a7f4545c09b6b92357eff9aa8ec6c5c7f1cdca3d Apr 16 14:15:08.684318 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:08.684273 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:15:08.684318 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:08.684326 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:15:08.708508 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:08.708467 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:15:08.877993 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:08.877952 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" event={"ID":"c1c83d78-d385-47ec-b767-faa11cfd758b","Type":"ContainerStarted","Data":"74764e0d490a944e8511f3f95a60cb8ef5c44c7a08bc109e25a4a432b6a76753"} Apr 16 14:15:08.877993 
ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:08.877994 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" event={"ID":"c1c83d78-d385-47ec-b767-faa11cfd758b","Type":"ContainerStarted","Data":"c4c739734f4d83dc8d233622a7f4545c09b6b92357eff9aa8ec6c5c7f1cdca3d"} Apr 16 14:15:08.909279 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:08.908716 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:15:12.900986 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:12.900950 2569 generic.go:358] "Generic (PLEG): container finished" podID="c1c83d78-d385-47ec-b767-faa11cfd758b" containerID="74764e0d490a944e8511f3f95a60cb8ef5c44c7a08bc109e25a4a432b6a76753" exitCode=0 Apr 16 14:15:12.901490 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:12.901029 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" event={"ID":"c1c83d78-d385-47ec-b767-faa11cfd758b","Type":"ContainerDied","Data":"74764e0d490a944e8511f3f95a60cb8ef5c44c7a08bc109e25a4a432b6a76753"} Apr 16 14:15:13.906554 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:13.906524 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" event={"ID":"c1c83d78-d385-47ec-b767-faa11cfd758b","Type":"ContainerStarted","Data":"b2d2030c448e64d049b8df9bad559e86c7532bbe0364a3afe0c1e47d05fd5e16"} Apr 16 14:15:13.927334 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:13.927283 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" podStartSLOduration=6.927268891 podStartE2EDuration="6.927268891s" podCreationTimestamp="2026-04-16 14:15:07 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:15:13.925494656 +0000 UTC m=+948.933730913" watchObservedRunningTime="2026-04-16 14:15:13.927268891 +0000 UTC m=+948.935505137" Apr 16 14:15:17.792439 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:17.792397 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:17.792838 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:17.792748 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:17.804860 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:17.804837 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:17.935375 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:17.935343 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:15:43.501378 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.501343 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8"] Apr 16 14:15:43.502166 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.501621 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" podUID="9d4e3e8d-d506-4981-baaa-5cf335b9fc2c" containerName="main" containerID="cri-o://2bd198fd63539d6d3cd5a86c78d7d45984565e9692e5e04c6435ee4d05e4974a" gracePeriod=30 Apr 16 14:15:43.749936 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.749912 2569 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:15:43.847180 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.847085 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-kserve-provision-location\") pod \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " Apr 16 14:15:43.847180 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.847119 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-home\") pod \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " Apr 16 14:15:43.847180 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.847143 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-model-cache\") pod \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " Apr 16 14:15:43.847513 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.847195 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-dshm\") pod \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " Apr 16 14:15:43.847513 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.847275 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-tls-certs\") pod \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " Apr 16 14:15:43.847513 ip-10-0-128-60 kubenswrapper[2569]: I0416 
14:15:43.847302 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm5vk\" (UniqueName: \"kubernetes.io/projected/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-kube-api-access-cm5vk\") pod \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\" (UID: \"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c\") " Apr 16 14:15:43.847513 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.847407 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-home" (OuterVolumeSpecName: "home") pod "9d4e3e8d-d506-4981-baaa-5cf335b9fc2c" (UID: "9d4e3e8d-d506-4981-baaa-5cf335b9fc2c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:15:43.847722 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.847605 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-home\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:15:43.847722 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.847603 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-model-cache" (OuterVolumeSpecName: "model-cache") pod "9d4e3e8d-d506-4981-baaa-5cf335b9fc2c" (UID: "9d4e3e8d-d506-4981-baaa-5cf335b9fc2c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:15:43.849754 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.849714 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-dshm" (OuterVolumeSpecName: "dshm") pod "9d4e3e8d-d506-4981-baaa-5cf335b9fc2c" (UID: "9d4e3e8d-d506-4981-baaa-5cf335b9fc2c"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:15:43.849864 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.849824 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9d4e3e8d-d506-4981-baaa-5cf335b9fc2c" (UID: "9d4e3e8d-d506-4981-baaa-5cf335b9fc2c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:15:43.849864 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.849848 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-kube-api-access-cm5vk" (OuterVolumeSpecName: "kube-api-access-cm5vk") pod "9d4e3e8d-d506-4981-baaa-5cf335b9fc2c" (UID: "9d4e3e8d-d506-4981-baaa-5cf335b9fc2c"). InnerVolumeSpecName "kube-api-access-cm5vk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:15:43.902036 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.901996 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9d4e3e8d-d506-4981-baaa-5cf335b9fc2c" (UID: "9d4e3e8d-d506-4981-baaa-5cf335b9fc2c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:15:43.948199 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.948162 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-tls-certs\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:15:43.948199 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.948197 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cm5vk\" (UniqueName: \"kubernetes.io/projected/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-kube-api-access-cm5vk\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:15:43.948367 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.948207 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-kserve-provision-location\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:15:43.948367 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.948218 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-model-cache\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:15:43.948367 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:43.948227 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c-dshm\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:15:44.024169 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:44.024132 2569 generic.go:358] "Generic (PLEG): container finished" podID="9d4e3e8d-d506-4981-baaa-5cf335b9fc2c" containerID="2bd198fd63539d6d3cd5a86c78d7d45984565e9692e5e04c6435ee4d05e4974a" exitCode=0 Apr 16 14:15:44.024352 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:44.024213 2569 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" Apr 16 14:15:44.024352 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:44.024215 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" event={"ID":"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c","Type":"ContainerDied","Data":"2bd198fd63539d6d3cd5a86c78d7d45984565e9692e5e04c6435ee4d05e4974a"} Apr 16 14:15:44.024352 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:44.024279 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8" event={"ID":"9d4e3e8d-d506-4981-baaa-5cf335b9fc2c","Type":"ContainerDied","Data":"480ba3f09542ff6f1e7ebd943690f1130edd509a76a9ac69aa75a11a80feebeb"} Apr 16 14:15:44.024352 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:44.024300 2569 scope.go:117] "RemoveContainer" containerID="2bd198fd63539d6d3cd5a86c78d7d45984565e9692e5e04c6435ee4d05e4974a" Apr 16 14:15:44.033561 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:44.033540 2569 scope.go:117] "RemoveContainer" containerID="f919846707a90e13650f599e7835a04d3048847de3fd8b994e77bd9e10f414a4" Apr 16 14:15:44.044467 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:44.044441 2569 scope.go:117] "RemoveContainer" containerID="2bd198fd63539d6d3cd5a86c78d7d45984565e9692e5e04c6435ee4d05e4974a" Apr 16 14:15:44.044758 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:15:44.044736 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bd198fd63539d6d3cd5a86c78d7d45984565e9692e5e04c6435ee4d05e4974a\": container with ID starting with 2bd198fd63539d6d3cd5a86c78d7d45984565e9692e5e04c6435ee4d05e4974a not found: ID does not exist" containerID="2bd198fd63539d6d3cd5a86c78d7d45984565e9692e5e04c6435ee4d05e4974a" Apr 16 14:15:44.044836 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:44.044766 
2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bd198fd63539d6d3cd5a86c78d7d45984565e9692e5e04c6435ee4d05e4974a"} err="failed to get container status \"2bd198fd63539d6d3cd5a86c78d7d45984565e9692e5e04c6435ee4d05e4974a\": rpc error: code = NotFound desc = could not find container \"2bd198fd63539d6d3cd5a86c78d7d45984565e9692e5e04c6435ee4d05e4974a\": container with ID starting with 2bd198fd63539d6d3cd5a86c78d7d45984565e9692e5e04c6435ee4d05e4974a not found: ID does not exist" Apr 16 14:15:44.044836 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:44.044783 2569 scope.go:117] "RemoveContainer" containerID="f919846707a90e13650f599e7835a04d3048847de3fd8b994e77bd9e10f414a4" Apr 16 14:15:44.045049 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:15:44.045026 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f919846707a90e13650f599e7835a04d3048847de3fd8b994e77bd9e10f414a4\": container with ID starting with f919846707a90e13650f599e7835a04d3048847de3fd8b994e77bd9e10f414a4 not found: ID does not exist" containerID="f919846707a90e13650f599e7835a04d3048847de3fd8b994e77bd9e10f414a4" Apr 16 14:15:44.045115 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:44.045057 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f919846707a90e13650f599e7835a04d3048847de3fd8b994e77bd9e10f414a4"} err="failed to get container status \"f919846707a90e13650f599e7835a04d3048847de3fd8b994e77bd9e10f414a4\": rpc error: code = NotFound desc = could not find container \"f919846707a90e13650f599e7835a04d3048847de3fd8b994e77bd9e10f414a4\": container with ID starting with f919846707a90e13650f599e7835a04d3048847de3fd8b994e77bd9e10f414a4 not found: ID does not exist" Apr 16 14:15:44.046757 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:44.046735 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8"] Apr 16 14:15:44.052508 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:44.052487 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-57d5df87b-vdzf8"] Apr 16 14:15:45.603909 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:45.603866 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4e3e8d-d506-4981-baaa-5cf335b9fc2c" path="/var/lib/kubelet/pods/9d4e3e8d-d506-4981-baaa-5cf335b9fc2c/volumes" Apr 16 14:15:50.906978 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:50.906934 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h"] Apr 16 14:15:50.907559 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:50.907520 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d4e3e8d-d506-4981-baaa-5cf335b9fc2c" containerName="storage-initializer" Apr 16 14:15:50.907559 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:50.907540 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4e3e8d-d506-4981-baaa-5cf335b9fc2c" containerName="storage-initializer" Apr 16 14:15:50.907686 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:50.907568 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d4e3e8d-d506-4981-baaa-5cf335b9fc2c" containerName="main" Apr 16 14:15:50.907686 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:50.907576 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4e3e8d-d506-4981-baaa-5cf335b9fc2c" containerName="main" Apr 16 14:15:50.907686 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:50.907668 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d4e3e8d-d506-4981-baaa-5cf335b9fc2c" containerName="main" Apr 16 14:15:50.910998 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:50.910977 2569 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:50.913771 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:50.913746 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 16 14:15:50.913906 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:50.913746 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-4l6l9\"" Apr 16 14:15:50.923169 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:50.923146 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h"] Apr 16 14:15:51.013510 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.013472 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.013690 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.013546 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.013690 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.013580 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.013690 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.013611 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8dh4\" (UniqueName: \"kubernetes.io/projected/92c10ec2-fce8-4723-9b67-054bd0ed4a28-kube-api-access-r8dh4\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.013802 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.013685 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.013802 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.013744 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.115158 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.115110 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.115404 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.115166 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.115404 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.115236 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.115404 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.115296 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.115404 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.115335 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r8dh4\" (UniqueName: \"kubernetes.io/projected/92c10ec2-fce8-4723-9b67-054bd0ed4a28-kube-api-access-r8dh4\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.115404 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.115368 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.115754 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.115551 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.115754 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.115621 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.115754 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.115659 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.115754 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.115749 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.117847 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.117825 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.123424 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.123395 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8dh4\" (UniqueName: \"kubernetes.io/projected/92c10ec2-fce8-4723-9b67-054bd0ed4a28-kube-api-access-r8dh4\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.221682 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.221645 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:15:51.360386 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:51.360360 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h"] Apr 16 14:15:51.361728 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:15:51.361698 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92c10ec2_fce8_4723_9b67_054bd0ed4a28.slice/crio-6a9d4e896069609a2d9116f66572285e277494319097a5ae53848bd26f1ed5d2 WatchSource:0}: Error finding container 6a9d4e896069609a2d9116f66572285e277494319097a5ae53848bd26f1ed5d2: Status 404 returned error can't find the container with id 6a9d4e896069609a2d9116f66572285e277494319097a5ae53848bd26f1ed5d2 Apr 16 14:15:52.057720 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:52.057681 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" event={"ID":"92c10ec2-fce8-4723-9b67-054bd0ed4a28","Type":"ContainerStarted","Data":"3e093accb662f682e26ecb78c180fb6753e64f2c5dffc5b6ac862d830a885f21"} Apr 16 14:15:52.057720 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:52.057722 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" event={"ID":"92c10ec2-fce8-4723-9b67-054bd0ed4a28","Type":"ContainerStarted","Data":"6a9d4e896069609a2d9116f66572285e277494319097a5ae53848bd26f1ed5d2"} Apr 16 14:15:53.067429 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:53.067352 2569 generic.go:358] "Generic (PLEG): container finished" podID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerID="3e093accb662f682e26ecb78c180fb6753e64f2c5dffc5b6ac862d830a885f21" exitCode=0 Apr 16 14:15:53.067855 ip-10-0-128-60 kubenswrapper[2569]: I0416 
14:15:53.067446 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" event={"ID":"92c10ec2-fce8-4723-9b67-054bd0ed4a28","Type":"ContainerDied","Data":"3e093accb662f682e26ecb78c180fb6753e64f2c5dffc5b6ac862d830a885f21"} Apr 16 14:15:55.079221 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:15:55.079180 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" event={"ID":"92c10ec2-fce8-4723-9b67-054bd0ed4a28","Type":"ContainerStarted","Data":"0efe1cdc955fa301341ad072ecec3f61eec838b8f55a842680d4bd9535239a77"} Apr 16 14:16:11.054953 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:11.054914 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c"] Apr 16 14:16:11.055643 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:11.055611 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" podUID="c1c83d78-d385-47ec-b767-faa11cfd758b" containerName="main" containerID="cri-o://b2d2030c448e64d049b8df9bad559e86c7532bbe0364a3afe0c1e47d05fd5e16" gracePeriod=30 Apr 16 14:16:17.925193 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:17.925143 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" podUID="c1c83d78-d385-47ec-b767-faa11cfd758b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 16 14:16:23.206092 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.206054 2569 generic.go:358] "Generic (PLEG): container finished" podID="c1c83d78-d385-47ec-b767-faa11cfd758b" 
containerID="b2d2030c448e64d049b8df9bad559e86c7532bbe0364a3afe0c1e47d05fd5e16" exitCode=0 Apr 16 14:16:23.206476 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.206125 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" event={"ID":"c1c83d78-d385-47ec-b767-faa11cfd758b","Type":"ContainerDied","Data":"b2d2030c448e64d049b8df9bad559e86c7532bbe0364a3afe0c1e47d05fd5e16"} Apr 16 14:16:23.506406 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.506382 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:16:23.525187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.524652 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-dshm\") pod \"c1c83d78-d385-47ec-b767-faa11cfd758b\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " Apr 16 14:16:23.525187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.524696 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbc6h\" (UniqueName: \"kubernetes.io/projected/c1c83d78-d385-47ec-b767-faa11cfd758b-kube-api-access-zbc6h\") pod \"c1c83d78-d385-47ec-b767-faa11cfd758b\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " Apr 16 14:16:23.525187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.524753 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-kserve-provision-location\") pod \"c1c83d78-d385-47ec-b767-faa11cfd758b\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " Apr 16 14:16:23.525187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.524795 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-home\") pod \"c1c83d78-d385-47ec-b767-faa11cfd758b\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " Apr 16 14:16:23.525187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.524832 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-model-cache\") pod \"c1c83d78-d385-47ec-b767-faa11cfd758b\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " Apr 16 14:16:23.525187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.524867 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c1c83d78-d385-47ec-b767-faa11cfd758b-tls-certs\") pod \"c1c83d78-d385-47ec-b767-faa11cfd758b\" (UID: \"c1c83d78-d385-47ec-b767-faa11cfd758b\") " Apr 16 14:16:23.525187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.525135 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-home" (OuterVolumeSpecName: "home") pod "c1c83d78-d385-47ec-b767-faa11cfd758b" (UID: "c1c83d78-d385-47ec-b767-faa11cfd758b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:23.525717 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.525394 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-model-cache" (OuterVolumeSpecName: "model-cache") pod "c1c83d78-d385-47ec-b767-faa11cfd758b" (UID: "c1c83d78-d385-47ec-b767-faa11cfd758b"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:23.527287 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.527197 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-dshm" (OuterVolumeSpecName: "dshm") pod "c1c83d78-d385-47ec-b767-faa11cfd758b" (UID: "c1c83d78-d385-47ec-b767-faa11cfd758b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:23.528375 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.528343 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1c83d78-d385-47ec-b767-faa11cfd758b-kube-api-access-zbc6h" (OuterVolumeSpecName: "kube-api-access-zbc6h") pod "c1c83d78-d385-47ec-b767-faa11cfd758b" (UID: "c1c83d78-d385-47ec-b767-faa11cfd758b"). InnerVolumeSpecName "kube-api-access-zbc6h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:16:23.529817 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.529781 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1c83d78-d385-47ec-b767-faa11cfd758b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c1c83d78-d385-47ec-b767-faa11cfd758b" (UID: "c1c83d78-d385-47ec-b767-faa11cfd758b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:16:23.582967 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.582900 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c1c83d78-d385-47ec-b767-faa11cfd758b" (UID: "c1c83d78-d385-47ec-b767-faa11cfd758b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:23.626318 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.626288 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-kserve-provision-location\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:16:23.626318 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.626314 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-home\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:16:23.626318 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.626324 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-model-cache\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:16:23.626536 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.626334 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c1c83d78-d385-47ec-b767-faa11cfd758b-tls-certs\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:16:23.626536 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.626343 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c1c83d78-d385-47ec-b767-faa11cfd758b-dshm\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:16:23.626536 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:23.626352 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zbc6h\" (UniqueName: \"kubernetes.io/projected/c1c83d78-d385-47ec-b767-faa11cfd758b-kube-api-access-zbc6h\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:16:24.211934 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:24.211902 2569 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" Apr 16 14:16:24.212364 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:24.211904 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c" event={"ID":"c1c83d78-d385-47ec-b767-faa11cfd758b","Type":"ContainerDied","Data":"c4c739734f4d83dc8d233622a7f4545c09b6b92357eff9aa8ec6c5c7f1cdca3d"} Apr 16 14:16:24.212364 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:24.212030 2569 scope.go:117] "RemoveContainer" containerID="b2d2030c448e64d049b8df9bad559e86c7532bbe0364a3afe0c1e47d05fd5e16" Apr 16 14:16:24.231112 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:24.231050 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c"] Apr 16 14:16:24.234105 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:24.234080 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5767b896d8tcj7c"] Apr 16 14:16:24.321527 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:24.321503 2569 scope.go:117] "RemoveContainer" containerID="74764e0d490a944e8511f3f95a60cb8ef5c44c7a08bc109e25a4a432b6a76753" Apr 16 14:16:25.221759 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:25.221714 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" event={"ID":"92c10ec2-fce8-4723-9b67-054bd0ed4a28","Type":"ContainerStarted","Data":"0084addd4146a03e18e77b808bc471a2954f5febaec0b7726173e6c0b4252fb1"} Apr 16 14:16:25.222217 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:25.221883 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 
16 14:16:25.224405 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:25.224379 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" podUID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 14:16:25.243923 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:25.243869 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" podStartSLOduration=3.183064024 podStartE2EDuration="35.243855144s" podCreationTimestamp="2026-04-16 14:15:50 +0000 UTC" firstStartedPulling="2026-04-16 14:15:53.069182884 +0000 UTC m=+988.077419118" lastFinishedPulling="2026-04-16 14:16:25.129973998 +0000 UTC m=+1020.138210238" observedRunningTime="2026-04-16 14:16:25.241770049 +0000 UTC m=+1020.250006323" watchObservedRunningTime="2026-04-16 14:16:25.243855144 +0000 UTC m=+1020.252091418" Apr 16 14:16:25.605101 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:25.605069 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1c83d78-d385-47ec-b767-faa11cfd758b" path="/var/lib/kubelet/pods/c1c83d78-d385-47ec-b767-faa11cfd758b/volumes" Apr 16 14:16:26.229038 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:26.228998 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" podUID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 14:16:31.222124 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:31.222080 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:16:31.222124 ip-10-0-128-60 
kubenswrapper[2569]: I0416 14:16:31.222128 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:16:31.222787 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:31.222563 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" podUID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.47:8082/healthz\": dial tcp 10.133.0.47:8082: connect: connection refused" Apr 16 14:16:31.223657 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:31.223632 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" podUID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 14:16:38.473536 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:38.473501 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h"] Apr 16 14:16:38.474060 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:38.473864 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" podUID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerName="tokenizer" containerID="cri-o://0084addd4146a03e18e77b808bc471a2954f5febaec0b7726173e6c0b4252fb1" gracePeriod=30 Apr 16 14:16:38.474060 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:38.473861 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" podUID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerName="main" 
containerID="cri-o://0efe1cdc955fa301341ad072ecec3f61eec838b8f55a842680d4bd9535239a77" gracePeriod=30 Apr 16 14:16:38.475712 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:38.475682 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" podUID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 14:16:39.280718 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:39.280685 2569 generic.go:358] "Generic (PLEG): container finished" podID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerID="0efe1cdc955fa301341ad072ecec3f61eec838b8f55a842680d4bd9535239a77" exitCode=0 Apr 16 14:16:39.280911 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:39.280765 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" event={"ID":"92c10ec2-fce8-4723-9b67-054bd0ed4a28","Type":"ContainerDied","Data":"0efe1cdc955fa301341ad072ecec3f61eec838b8f55a842680d4bd9535239a77"} Apr 16 14:16:39.828118 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:39.828093 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:16:39.971597 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:39.971559 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tls-certs\") pod \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " Apr 16 14:16:39.972681 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:39.971635 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-tmp\") pod \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " Apr 16 14:16:39.972681 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:39.971675 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-uds\") pod \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " Apr 16 14:16:39.972681 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:39.971737 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-kserve-provision-location\") pod \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " Apr 16 14:16:39.972681 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:39.971771 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-cache\") pod \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " Apr 16 
14:16:39.972681 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:39.971812 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8dh4\" (UniqueName: \"kubernetes.io/projected/92c10ec2-fce8-4723-9b67-054bd0ed4a28-kube-api-access-r8dh4\") pod \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\" (UID: \"92c10ec2-fce8-4723-9b67-054bd0ed4a28\") " Apr 16 14:16:39.972681 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:39.972071 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "92c10ec2-fce8-4723-9b67-054bd0ed4a28" (UID: "92c10ec2-fce8-4723-9b67-054bd0ed4a28"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:39.972681 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:39.972278 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "92c10ec2-fce8-4723-9b67-054bd0ed4a28" (UID: "92c10ec2-fce8-4723-9b67-054bd0ed4a28"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:39.972681 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:39.972451 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "92c10ec2-fce8-4723-9b67-054bd0ed4a28" (UID: "92c10ec2-fce8-4723-9b67-054bd0ed4a28"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:39.973144 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:39.972786 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "92c10ec2-fce8-4723-9b67-054bd0ed4a28" (UID: "92c10ec2-fce8-4723-9b67-054bd0ed4a28"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:39.974818 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:39.974789 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "92c10ec2-fce8-4723-9b67-054bd0ed4a28" (UID: "92c10ec2-fce8-4723-9b67-054bd0ed4a28"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:16:39.974818 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:39.974798 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92c10ec2-fce8-4723-9b67-054bd0ed4a28-kube-api-access-r8dh4" (OuterVolumeSpecName: "kube-api-access-r8dh4") pod "92c10ec2-fce8-4723-9b67-054bd0ed4a28" (UID: "92c10ec2-fce8-4723-9b67-054bd0ed4a28"). InnerVolumeSpecName "kube-api-access-r8dh4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:16:40.073443 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.073408 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-cache\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:16:40.073443 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.073437 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r8dh4\" (UniqueName: \"kubernetes.io/projected/92c10ec2-fce8-4723-9b67-054bd0ed4a28-kube-api-access-r8dh4\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:16:40.073443 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.073448 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tls-certs\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:16:40.073689 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.073459 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-tmp\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:16:40.073689 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.073469 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-tokenizer-uds\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:16:40.073689 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.073478 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92c10ec2-fce8-4723-9b67-054bd0ed4a28-kserve-provision-location\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:16:40.286612 ip-10-0-128-60 kubenswrapper[2569]: 
I0416 14:16:40.286517 2569 generic.go:358] "Generic (PLEG): container finished" podID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerID="0084addd4146a03e18e77b808bc471a2954f5febaec0b7726173e6c0b4252fb1" exitCode=0 Apr 16 14:16:40.286612 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.286600 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" Apr 16 14:16:40.286853 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.286598 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" event={"ID":"92c10ec2-fce8-4723-9b67-054bd0ed4a28","Type":"ContainerDied","Data":"0084addd4146a03e18e77b808bc471a2954f5febaec0b7726173e6c0b4252fb1"} Apr 16 14:16:40.286853 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.286718 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h" event={"ID":"92c10ec2-fce8-4723-9b67-054bd0ed4a28","Type":"ContainerDied","Data":"6a9d4e896069609a2d9116f66572285e277494319097a5ae53848bd26f1ed5d2"} Apr 16 14:16:40.286853 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.286735 2569 scope.go:117] "RemoveContainer" containerID="0084addd4146a03e18e77b808bc471a2954f5febaec0b7726173e6c0b4252fb1" Apr 16 14:16:40.296686 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.296671 2569 scope.go:117] "RemoveContainer" containerID="0efe1cdc955fa301341ad072ecec3f61eec838b8f55a842680d4bd9535239a77" Apr 16 14:16:40.304437 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.304420 2569 scope.go:117] "RemoveContainer" containerID="3e093accb662f682e26ecb78c180fb6753e64f2c5dffc5b6ac862d830a885f21" Apr 16 14:16:40.311014 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.310989 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h"] Apr 16 14:16:40.314146 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.314130 2569 scope.go:117] "RemoveContainer" containerID="0084addd4146a03e18e77b808bc471a2954f5febaec0b7726173e6c0b4252fb1" Apr 16 14:16:40.314426 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:16:40.314405 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0084addd4146a03e18e77b808bc471a2954f5febaec0b7726173e6c0b4252fb1\": container with ID starting with 0084addd4146a03e18e77b808bc471a2954f5febaec0b7726173e6c0b4252fb1 not found: ID does not exist" containerID="0084addd4146a03e18e77b808bc471a2954f5febaec0b7726173e6c0b4252fb1" Apr 16 14:16:40.314525 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.314435 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0084addd4146a03e18e77b808bc471a2954f5febaec0b7726173e6c0b4252fb1"} err="failed to get container status \"0084addd4146a03e18e77b808bc471a2954f5febaec0b7726173e6c0b4252fb1\": rpc error: code = NotFound desc = could not find container \"0084addd4146a03e18e77b808bc471a2954f5febaec0b7726173e6c0b4252fb1\": container with ID starting with 0084addd4146a03e18e77b808bc471a2954f5febaec0b7726173e6c0b4252fb1 not found: ID does not exist" Apr 16 14:16:40.314525 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.314455 2569 scope.go:117] "RemoveContainer" containerID="0efe1cdc955fa301341ad072ecec3f61eec838b8f55a842680d4bd9535239a77" Apr 16 14:16:40.314702 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:16:40.314684 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0efe1cdc955fa301341ad072ecec3f61eec838b8f55a842680d4bd9535239a77\": container with ID starting with 0efe1cdc955fa301341ad072ecec3f61eec838b8f55a842680d4bd9535239a77 not found: ID does not exist" 
containerID="0efe1cdc955fa301341ad072ecec3f61eec838b8f55a842680d4bd9535239a77" Apr 16 14:16:40.314742 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.314709 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0efe1cdc955fa301341ad072ecec3f61eec838b8f55a842680d4bd9535239a77"} err="failed to get container status \"0efe1cdc955fa301341ad072ecec3f61eec838b8f55a842680d4bd9535239a77\": rpc error: code = NotFound desc = could not find container \"0efe1cdc955fa301341ad072ecec3f61eec838b8f55a842680d4bd9535239a77\": container with ID starting with 0efe1cdc955fa301341ad072ecec3f61eec838b8f55a842680d4bd9535239a77 not found: ID does not exist" Apr 16 14:16:40.314742 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.314727 2569 scope.go:117] "RemoveContainer" containerID="3e093accb662f682e26ecb78c180fb6753e64f2c5dffc5b6ac862d830a885f21" Apr 16 14:16:40.314976 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:16:40.314960 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e093accb662f682e26ecb78c180fb6753e64f2c5dffc5b6ac862d830a885f21\": container with ID starting with 3e093accb662f682e26ecb78c180fb6753e64f2c5dffc5b6ac862d830a885f21 not found: ID does not exist" containerID="3e093accb662f682e26ecb78c180fb6753e64f2c5dffc5b6ac862d830a885f21" Apr 16 14:16:40.315037 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.314989 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e093accb662f682e26ecb78c180fb6753e64f2c5dffc5b6ac862d830a885f21"} err="failed to get container status \"3e093accb662f682e26ecb78c180fb6753e64f2c5dffc5b6ac862d830a885f21\": rpc error: code = NotFound desc = could not find container \"3e093accb662f682e26ecb78c180fb6753e64f2c5dffc5b6ac862d830a885f21\": container with ID starting with 3e093accb662f682e26ecb78c180fb6753e64f2c5dffc5b6ac862d830a885f21 not found: ID does not exist" Apr 16 
14:16:40.315234 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:40.315211 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8jpc8h"] Apr 16 14:16:41.603553 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:41.603521 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" path="/var/lib/kubelet/pods/92c10ec2-fce8-4723-9b67-054bd0ed4a28/volumes" Apr 16 14:16:51.241412 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.241372 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"] Apr 16 14:16:51.241922 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.241803 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerName="storage-initializer" Apr 16 14:16:51.241922 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.241817 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerName="storage-initializer" Apr 16 14:16:51.241922 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.241831 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1c83d78-d385-47ec-b767-faa11cfd758b" containerName="storage-initializer" Apr 16 14:16:51.241922 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.241836 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c83d78-d385-47ec-b767-faa11cfd758b" containerName="storage-initializer" Apr 16 14:16:51.241922 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.241852 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1c83d78-d385-47ec-b767-faa11cfd758b" containerName="main" Apr 16 14:16:51.241922 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.241858 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c1c83d78-d385-47ec-b767-faa11cfd758b" containerName="main" Apr 16 14:16:51.241922 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.241871 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerName="main" Apr 16 14:16:51.241922 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.241878 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerName="main" Apr 16 14:16:51.241922 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.241894 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerName="tokenizer" Apr 16 14:16:51.241922 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.241900 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerName="tokenizer" Apr 16 14:16:51.242239 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.241959 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerName="tokenizer" Apr 16 14:16:51.242239 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.241977 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="92c10ec2-fce8-4723-9b67-054bd0ed4a28" containerName="main" Apr 16 14:16:51.242239 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.241989 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1c83d78-d385-47ec-b767-faa11cfd758b" containerName="main" Apr 16 14:16:51.248430 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.248408 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" Apr 16 14:16:51.252426 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.252398 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xs8cb\"" Apr 16 14:16:51.252585 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.252444 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 14:16:51.258099 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.258071 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"] Apr 16 14:16:51.374862 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.374817 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-home\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" Apr 16 14:16:51.374862 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.374859 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8xfk\" (UniqueName: \"kubernetes.io/projected/7621aae2-e6fd-48f8-8f10-3e44fde336bf-kube-api-access-p8xfk\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" Apr 16 14:16:51.375093 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.374925 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-dshm\") pod 
\"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" Apr 16 14:16:51.375093 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.374979 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-model-cache\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" Apr 16 14:16:51.375093 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.375002 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7621aae2-e6fd-48f8-8f10-3e44fde336bf-tls-certs\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" Apr 16 14:16:51.375093 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.375023 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" Apr 16 14:16:51.455315 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.455281 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"] Apr 16 14:16:51.469376 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.469341 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"] Apr 16 14:16:51.469742 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.469725 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" Apr 16 14:16:51.476079 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.476046 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-dshm\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" Apr 16 14:16:51.476187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.476117 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-model-cache\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" Apr 16 14:16:51.476187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.476137 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-jzl88\"" Apr 16 14:16:51.476187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.476156 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7621aae2-e6fd-48f8-8f10-3e44fde336bf-tls-certs\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" Apr 16 14:16:51.476362 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.476191 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"
Apr 16 14:16:51.476807 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.476780 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-home\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"
Apr 16 14:16:51.476922 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.476828 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8xfk\" (UniqueName: \"kubernetes.io/projected/7621aae2-e6fd-48f8-8f10-3e44fde336bf-kube-api-access-p8xfk\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"
Apr 16 14:16:51.477029 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.477006 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"
Apr 16 14:16:51.477183 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.477028 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-model-cache\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"
Apr 16 14:16:51.477480 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.477455 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-home\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"
Apr 16 14:16:51.479670 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.479563 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-dshm\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"
Apr 16 14:16:51.479670 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.479610 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7621aae2-e6fd-48f8-8f10-3e44fde336bf-tls-certs\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"
Apr 16 14:16:51.494487 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.494432 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8xfk\" (UniqueName: \"kubernetes.io/projected/7621aae2-e6fd-48f8-8f10-3e44fde336bf-kube-api-access-p8xfk\") pod \"precise-prefix-cache-test-kserve-576666b544-l2s2l\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"
Apr 16 14:16:51.560402 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.560367 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"
Apr 16 14:16:51.577464 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.577417 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.577652 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.577587 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.577743 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.577695 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.577813 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.577774 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6ssz\" (UniqueName: \"kubernetes.io/projected/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-kube-api-access-b6ssz\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.577869 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.577813 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.577968 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.577876 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.678931 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.678878 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.679135 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.678941 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.679135 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.679063 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6ssz\" (UniqueName: \"kubernetes.io/projected/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-kube-api-access-b6ssz\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.679135 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.679131 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.679368 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.679182 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.679368 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.679246 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.679368 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.679346 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.679536 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.679414 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.679747 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.679726 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.679899 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.679870 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.682511 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.682490 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.686863 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.686838 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6ssz\" (UniqueName: \"kubernetes.io/projected/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-kube-api-access-b6ssz\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.696116 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.696089 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"]
Apr 16 14:16:51.697505 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:16:51.697484 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7621aae2_e6fd_48f8_8f10_3e44fde336bf.slice/crio-48c11abe13a2493821a4a4c4b49b841d529e1ee1f8da92e2bc2063568fd0da8d WatchSource:0}: Error finding container 48c11abe13a2493821a4a4c4b49b841d529e1ee1f8da92e2bc2063568fd0da8d: Status 404 returned error can't find the container with id 48c11abe13a2493821a4a4c4b49b841d529e1ee1f8da92e2bc2063568fd0da8d
Apr 16 14:16:51.699313 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.699298 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:16:51.790358 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.790318 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:51.943849 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:51.943810 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"]
Apr 16 14:16:51.945672 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:16:51.945633 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99a1bcc4_5df7_42b1_ab3e_bcc740e0c06d.slice/crio-4508d0eb0060ef6b71800ac782e9d516c25f834366877e103bd651fab687cb8d WatchSource:0}: Error finding container 4508d0eb0060ef6b71800ac782e9d516c25f834366877e103bd651fab687cb8d: Status 404 returned error can't find the container with id 4508d0eb0060ef6b71800ac782e9d516c25f834366877e103bd651fab687cb8d
Apr 16 14:16:52.334592 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:52.334502 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" event={"ID":"7621aae2-e6fd-48f8-8f10-3e44fde336bf","Type":"ContainerStarted","Data":"4b80b270ed2ea367006ac37f05ae94a9f2ef54513ba37d6e67455d5b1f3525d6"}
Apr 16 14:16:52.334592 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:52.334549 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" event={"ID":"7621aae2-e6fd-48f8-8f10-3e44fde336bf","Type":"ContainerStarted","Data":"48c11abe13a2493821a4a4c4b49b841d529e1ee1f8da92e2bc2063568fd0da8d"}
Apr 16 14:16:52.335997 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:52.335971 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" event={"ID":"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d","Type":"ContainerStarted","Data":"30e7c305fe0f9a1a1b62183d1a5079b661c1e1b75be7cb7e795c052355c05e00"}
Apr 16 14:16:52.335997 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:52.335999 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" event={"ID":"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d","Type":"ContainerStarted","Data":"4508d0eb0060ef6b71800ac782e9d516c25f834366877e103bd651fab687cb8d"}
Apr 16 14:16:53.342013 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:53.341921 2569 generic.go:358] "Generic (PLEG): container finished" podID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerID="30e7c305fe0f9a1a1b62183d1a5079b661c1e1b75be7cb7e795c052355c05e00" exitCode=0
Apr 16 14:16:53.342013 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:53.342011 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" event={"ID":"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d","Type":"ContainerDied","Data":"30e7c305fe0f9a1a1b62183d1a5079b661c1e1b75be7cb7e795c052355c05e00"}
Apr 16 14:16:54.348653 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:54.348616 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" event={"ID":"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d","Type":"ContainerStarted","Data":"fc37e94e4427ca83e2ad0e5fd919244faa726e2cf4edaea57705c3282bec4a54"}
Apr 16 14:16:54.349071 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:54.348660 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" event={"ID":"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d","Type":"ContainerStarted","Data":"59aa723234176bbc824a705bf87d960c132a040e4f63f38d6c30e1b9a3e15d1b"}
Apr 16 14:16:54.349071 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:54.348696 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:16:54.373326 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:54.373270 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" podStartSLOduration=3.373236869 podStartE2EDuration="3.373236869s" podCreationTimestamp="2026-04-16 14:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:16:54.369282538 +0000 UTC m=+1049.377518795" watchObservedRunningTime="2026-04-16 14:16:54.373236869 +0000 UTC m=+1049.381473123"
Apr 16 14:16:56.373050 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:56.373003 2569 generic.go:358] "Generic (PLEG): container finished" podID="7621aae2-e6fd-48f8-8f10-3e44fde336bf" containerID="4b80b270ed2ea367006ac37f05ae94a9f2ef54513ba37d6e67455d5b1f3525d6" exitCode=0
Apr 16 14:16:56.373596 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:56.373048 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" event={"ID":"7621aae2-e6fd-48f8-8f10-3e44fde336bf","Type":"ContainerDied","Data":"4b80b270ed2ea367006ac37f05ae94a9f2ef54513ba37d6e67455d5b1f3525d6"}
Apr 16 14:16:57.380037 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:57.379997 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" event={"ID":"7621aae2-e6fd-48f8-8f10-3e44fde336bf","Type":"ContainerStarted","Data":"738e13ef89a7614b0092dedf70c6244b26ad8807578f74d558c3786065706760"}
Apr 16 14:16:57.402812 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:16:57.402763 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" podStartSLOduration=6.402747918 podStartE2EDuration="6.402747918s" podCreationTimestamp="2026-04-16 14:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:16:57.399437953 +0000 UTC m=+1052.407674207" watchObservedRunningTime="2026-04-16 14:16:57.402747918 +0000 UTC m=+1052.410984174"
Apr 16 14:17:01.561446 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:01.561410 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"
Apr 16 14:17:01.561446 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:01.561452 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"
Apr 16 14:17:01.574182 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:01.574152 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"
Apr 16 14:17:01.790811 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:01.790770 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:17:01.791001 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:01.790823 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:17:01.792194 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:17:01.792165 2569 logging.go:55] [core] [Channel #32 SubChannel #33]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.49:9003", ServerName: "10.133.0.49:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.49:9003: connect: connection refused"
Apr 16 14:17:01.793566 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:01.793542 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:17:02.405036 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:02.404997 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:17:02.415402 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:02.415374 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"
Apr 16 14:17:02.791041 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:02.790998 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" podUID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.49:9003\" within 1s: context deadline exceeded"
Apr 16 14:17:04.412450 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:04.412422 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv_99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d/main/0.log"
Apr 16 14:17:04.412861 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:04.412756 2569 generic.go:358] "Generic (PLEG): container finished" podID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerID="59aa723234176bbc824a705bf87d960c132a040e4f63f38d6c30e1b9a3e15d1b" exitCode=1
Apr 16 14:17:04.412861 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:04.412832 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" event={"ID":"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d","Type":"ContainerDied","Data":"59aa723234176bbc824a705bf87d960c132a040e4f63f38d6c30e1b9a3e15d1b"}
Apr 16 14:17:04.413303 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:04.413288 2569 scope.go:117] "RemoveContainer" containerID="59aa723234176bbc824a705bf87d960c132a040e4f63f38d6c30e1b9a3e15d1b"
Apr 16 14:17:05.419149 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:05.419118 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv_99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d/main/0.log"
Apr 16 14:17:05.419671 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:05.419490 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" event={"ID":"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d","Type":"ContainerStarted","Data":"517b63de031d804a23c20faec758e452662095de9e8ba3456901639ced0f719e"}
Apr 16 14:17:05.419801 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:05.419781 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:17:11.791633 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:17:11.791595 2569 logging.go:55] [core] [Channel #34 SubChannel #35]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.49:9003", ServerName: "10.133.0.49:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.49:9003: connect: connection refused"
Apr 16 14:17:12.792163 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:12.792113 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" podUID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.49:9003\" within 1s: context deadline exceeded"
Apr 16 14:17:36.425757 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:36.425719 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:17:37.386372 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.386333 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"]
Apr 16 14:17:37.386664 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.386627 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" podUID="7621aae2-e6fd-48f8-8f10-3e44fde336bf" containerName="main" containerID="cri-o://738e13ef89a7614b0092dedf70c6244b26ad8807578f74d558c3786065706760" gracePeriod=30
Apr 16 14:17:37.393960 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.393928 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"]
Apr 16 14:17:37.394844 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.394327 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" podUID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerName="tokenizer" containerID="cri-o://fc37e94e4427ca83e2ad0e5fd919244faa726e2cf4edaea57705c3282bec4a54" gracePeriod=30
Apr 16 14:17:37.394844 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.394447 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" podUID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerName="main" containerID="cri-o://517b63de031d804a23c20faec758e452662095de9e8ba3456901639ced0f719e" gracePeriod=30
Apr 16 14:17:37.550851 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.550823 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv_99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d/main/0.log"
Apr 16 14:17:37.551332 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.551215 2569 generic.go:358] "Generic (PLEG): container finished" podID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerID="517b63de031d804a23c20faec758e452662095de9e8ba3456901639ced0f719e" exitCode=0
Apr 16 14:17:37.551332 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.551291 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" event={"ID":"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d","Type":"ContainerDied","Data":"517b63de031d804a23c20faec758e452662095de9e8ba3456901639ced0f719e"}
Apr 16 14:17:37.551461 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.551344 2569 scope.go:117] "RemoveContainer" containerID="59aa723234176bbc824a705bf87d960c132a040e4f63f38d6c30e1b9a3e15d1b"
Apr 16 14:17:37.553291 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.553247 2569 generic.go:358] "Generic (PLEG): container finished" podID="7621aae2-e6fd-48f8-8f10-3e44fde336bf" containerID="738e13ef89a7614b0092dedf70c6244b26ad8807578f74d558c3786065706760" exitCode=0
Apr 16 14:17:37.553403 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.553325 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" event={"ID":"7621aae2-e6fd-48f8-8f10-3e44fde336bf","Type":"ContainerDied","Data":"738e13ef89a7614b0092dedf70c6244b26ad8807578f74d558c3786065706760"}
Apr 16 14:17:37.666678 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.666650 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"
Apr 16 14:17:37.812464 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.812429 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7621aae2-e6fd-48f8-8f10-3e44fde336bf-tls-certs\") pod \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") "
Apr 16 14:17:37.812464 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.812472 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-model-cache\") pod \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") "
Apr 16 14:17:37.812734 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.812532 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8xfk\" (UniqueName: \"kubernetes.io/projected/7621aae2-e6fd-48f8-8f10-3e44fde336bf-kube-api-access-p8xfk\") pod \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") "
Apr 16 14:17:37.812734 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.812564 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-home\") pod \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") "
Apr 16 14:17:37.812734 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.812628 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-kserve-provision-location\") pod \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") "
Apr 16 14:17:37.812734 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.812674 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-dshm\") pod \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\" (UID: \"7621aae2-e6fd-48f8-8f10-3e44fde336bf\") "
Apr 16 14:17:37.812953 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.812740 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-model-cache" (OuterVolumeSpecName: "model-cache") pod "7621aae2-e6fd-48f8-8f10-3e44fde336bf" (UID: "7621aae2-e6fd-48f8-8f10-3e44fde336bf"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:17:37.812953 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.812834 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-home" (OuterVolumeSpecName: "home") pod "7621aae2-e6fd-48f8-8f10-3e44fde336bf" (UID: "7621aae2-e6fd-48f8-8f10-3e44fde336bf"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:17:37.812953 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.812929 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-home\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:17:37.813085 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.812956 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-model-cache\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:17:37.814929 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.814895 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7621aae2-e6fd-48f8-8f10-3e44fde336bf-kube-api-access-p8xfk" (OuterVolumeSpecName: "kube-api-access-p8xfk") pod "7621aae2-e6fd-48f8-8f10-3e44fde336bf" (UID: "7621aae2-e6fd-48f8-8f10-3e44fde336bf"). InnerVolumeSpecName "kube-api-access-p8xfk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:17:37.814929 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.814895 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-dshm" (OuterVolumeSpecName: "dshm") pod "7621aae2-e6fd-48f8-8f10-3e44fde336bf" (UID: "7621aae2-e6fd-48f8-8f10-3e44fde336bf"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:17:37.815120 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.814926 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7621aae2-e6fd-48f8-8f10-3e44fde336bf-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7621aae2-e6fd-48f8-8f10-3e44fde336bf" (UID: "7621aae2-e6fd-48f8-8f10-3e44fde336bf"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:17:37.875936 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.875879 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7621aae2-e6fd-48f8-8f10-3e44fde336bf" (UID: "7621aae2-e6fd-48f8-8f10-3e44fde336bf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:17:37.914375 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.914344 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-dshm\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:17:37.914375 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.914375 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7621aae2-e6fd-48f8-8f10-3e44fde336bf-tls-certs\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:17:37.914528 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.914385 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p8xfk\" (UniqueName: \"kubernetes.io/projected/7621aae2-e6fd-48f8-8f10-3e44fde336bf-kube-api-access-p8xfk\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:17:37.914528 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:37.914396 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7621aae2-e6fd-48f8-8f10-3e44fde336bf-kserve-provision-location\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:17:38.558860 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:38.558815 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l" event={"ID":"7621aae2-e6fd-48f8-8f10-3e44fde336bf","Type":"ContainerDied","Data":"48c11abe13a2493821a4a4c4b49b841d529e1ee1f8da92e2bc2063568fd0da8d"}
Apr 16 14:17:38.559363 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:38.558877 2569 scope.go:117] "RemoveContainer" containerID="738e13ef89a7614b0092dedf70c6244b26ad8807578f74d558c3786065706760"
Apr 16 14:17:38.559363 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:38.558841 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"
Apr 16 14:17:38.568929 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:38.568910 2569 scope.go:117] "RemoveContainer" containerID="4b80b270ed2ea367006ac37f05ae94a9f2ef54513ba37d6e67455d5b1f3525d6"
Apr 16 14:17:38.583562 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:38.583523 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"]
Apr 16 14:17:38.586856 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:38.586832 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-l2s2l"]
Apr 16 14:17:39.046014 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.045989 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"
Apr 16 14:17:39.227420 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.227390 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-cache\") pod \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") "
Apr 16 14:17:39.227607 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.227444 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-tmp\") pod \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") "
Apr 16 14:17:39.227607 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.227467 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-kserve-provision-location\") pod \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") "
Apr 16 14:17:39.227607 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.227497 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tls-certs\") pod \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") "
Apr 16 14:17:39.227607 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.227530 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-uds\") pod \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") "
Apr 16
14:17:39.227607 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.227577 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6ssz\" (UniqueName: \"kubernetes.io/projected/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-kube-api-access-b6ssz\") pod \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\" (UID: \"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d\") " Apr 16 14:17:39.227888 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.227769 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" (UID: "99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:17:39.227888 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.227800 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" (UID: "99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:17:39.227888 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.227863 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" (UID: "99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:17:39.228199 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.228178 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" (UID: "99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:17:39.229742 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.229721 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" (UID: "99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:17:39.229802 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.229772 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-kube-api-access-b6ssz" (OuterVolumeSpecName: "kube-api-access-b6ssz") pod "99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" (UID: "99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d"). InnerVolumeSpecName "kube-api-access-b6ssz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:17:39.329132 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.329089 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-cache\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:17:39.329132 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.329127 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-tmp\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:17:39.329132 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.329138 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-kserve-provision-location\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:17:39.329392 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.329149 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tls-certs\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:17:39.329392 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.329161 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-tokenizer-uds\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:17:39.329392 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.329174 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b6ssz\" (UniqueName: \"kubernetes.io/projected/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d-kube-api-access-b6ssz\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:17:39.567481 ip-10-0-128-60 kubenswrapper[2569]: 
I0416 14:17:39.567396 2569 generic.go:358] "Generic (PLEG): container finished" podID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerID="fc37e94e4427ca83e2ad0e5fd919244faa726e2cf4edaea57705c3282bec4a54" exitCode=0 Apr 16 14:17:39.567481 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.567472 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" Apr 16 14:17:39.568029 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.567483 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" event={"ID":"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d","Type":"ContainerDied","Data":"fc37e94e4427ca83e2ad0e5fd919244faa726e2cf4edaea57705c3282bec4a54"} Apr 16 14:17:39.568029 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.567522 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv" event={"ID":"99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d","Type":"ContainerDied","Data":"4508d0eb0060ef6b71800ac782e9d516c25f834366877e103bd651fab687cb8d"} Apr 16 14:17:39.568029 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.567539 2569 scope.go:117] "RemoveContainer" containerID="517b63de031d804a23c20faec758e452662095de9e8ba3456901639ced0f719e" Apr 16 14:17:39.576725 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.576708 2569 scope.go:117] "RemoveContainer" containerID="fc37e94e4427ca83e2ad0e5fd919244faa726e2cf4edaea57705c3282bec4a54" Apr 16 14:17:39.585116 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.585093 2569 scope.go:117] "RemoveContainer" containerID="30e7c305fe0f9a1a1b62183d1a5079b661c1e1b75be7cb7e795c052355c05e00" Apr 16 14:17:39.590896 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.590873 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"] Apr 16 14:17:39.593949 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.593919 2569 scope.go:117] "RemoveContainer" containerID="517b63de031d804a23c20faec758e452662095de9e8ba3456901639ced0f719e" Apr 16 14:17:39.594318 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:17:39.594293 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"517b63de031d804a23c20faec758e452662095de9e8ba3456901639ced0f719e\": container with ID starting with 517b63de031d804a23c20faec758e452662095de9e8ba3456901639ced0f719e not found: ID does not exist" containerID="517b63de031d804a23c20faec758e452662095de9e8ba3456901639ced0f719e" Apr 16 14:17:39.594397 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.594331 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"517b63de031d804a23c20faec758e452662095de9e8ba3456901639ced0f719e"} err="failed to get container status \"517b63de031d804a23c20faec758e452662095de9e8ba3456901639ced0f719e\": rpc error: code = NotFound desc = could not find container \"517b63de031d804a23c20faec758e452662095de9e8ba3456901639ced0f719e\": container with ID starting with 517b63de031d804a23c20faec758e452662095de9e8ba3456901639ced0f719e not found: ID does not exist" Apr 16 14:17:39.594397 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.594356 2569 scope.go:117] "RemoveContainer" containerID="fc37e94e4427ca83e2ad0e5fd919244faa726e2cf4edaea57705c3282bec4a54" Apr 16 14:17:39.594647 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:17:39.594628 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc37e94e4427ca83e2ad0e5fd919244faa726e2cf4edaea57705c3282bec4a54\": container with ID starting with fc37e94e4427ca83e2ad0e5fd919244faa726e2cf4edaea57705c3282bec4a54 not found: ID does not exist" 
containerID="fc37e94e4427ca83e2ad0e5fd919244faa726e2cf4edaea57705c3282bec4a54" Apr 16 14:17:39.594705 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.594652 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc37e94e4427ca83e2ad0e5fd919244faa726e2cf4edaea57705c3282bec4a54"} err="failed to get container status \"fc37e94e4427ca83e2ad0e5fd919244faa726e2cf4edaea57705c3282bec4a54\": rpc error: code = NotFound desc = could not find container \"fc37e94e4427ca83e2ad0e5fd919244faa726e2cf4edaea57705c3282bec4a54\": container with ID starting with fc37e94e4427ca83e2ad0e5fd919244faa726e2cf4edaea57705c3282bec4a54 not found: ID does not exist" Apr 16 14:17:39.594705 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.594678 2569 scope.go:117] "RemoveContainer" containerID="30e7c305fe0f9a1a1b62183d1a5079b661c1e1b75be7cb7e795c052355c05e00" Apr 16 14:17:39.594942 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:17:39.594924 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30e7c305fe0f9a1a1b62183d1a5079b661c1e1b75be7cb7e795c052355c05e00\": container with ID starting with 30e7c305fe0f9a1a1b62183d1a5079b661c1e1b75be7cb7e795c052355c05e00 not found: ID does not exist" containerID="30e7c305fe0f9a1a1b62183d1a5079b661c1e1b75be7cb7e795c052355c05e00" Apr 16 14:17:39.594994 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.594951 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30e7c305fe0f9a1a1b62183d1a5079b661c1e1b75be7cb7e795c052355c05e00"} err="failed to get container status \"30e7c305fe0f9a1a1b62183d1a5079b661c1e1b75be7cb7e795c052355c05e00\": rpc error: code = NotFound desc = could not find container \"30e7c305fe0f9a1a1b62183d1a5079b661c1e1b75be7cb7e795c052355c05e00\": container with ID starting with 30e7c305fe0f9a1a1b62183d1a5079b661c1e1b75be7cb7e795c052355c05e00 not found: ID does not exist" Apr 16 
14:17:39.595652 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.595634 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f555b7p6hpv"] Apr 16 14:17:39.603666 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.603648 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7621aae2-e6fd-48f8-8f10-3e44fde336bf" path="/var/lib/kubelet/pods/7621aae2-e6fd-48f8-8f10-3e44fde336bf/volumes" Apr 16 14:17:39.604087 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:39.604074 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" path="/var/lib/kubelet/pods/99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d/volumes" Apr 16 14:17:51.699928 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.699883 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k"] Apr 16 14:17:51.700561 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.700451 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7621aae2-e6fd-48f8-8f10-3e44fde336bf" containerName="main" Apr 16 14:17:51.700561 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.700470 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7621aae2-e6fd-48f8-8f10-3e44fde336bf" containerName="main" Apr 16 14:17:51.700561 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.700487 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerName="main" Apr 16 14:17:51.700561 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.700496 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerName="main" Apr 16 14:17:51.700561 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.700509 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerName="main" Apr 16 14:17:51.700561 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.700518 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerName="main" Apr 16 14:17:51.700561 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.700532 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerName="tokenizer" Apr 16 14:17:51.700561 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.700539 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerName="tokenizer" Apr 16 14:17:51.700561 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.700554 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerName="storage-initializer" Apr 16 14:17:51.700561 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.700563 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerName="storage-initializer" Apr 16 14:17:51.701124 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.700576 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7621aae2-e6fd-48f8-8f10-3e44fde336bf" containerName="storage-initializer" Apr 16 14:17:51.701124 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.700584 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7621aae2-e6fd-48f8-8f10-3e44fde336bf" containerName="storage-initializer" Apr 16 14:17:51.701124 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.700680 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerName="main" Apr 16 14:17:51.701124 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.700693 2569 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerName="tokenizer" Apr 16 14:17:51.701124 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.700705 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="99a1bcc4-5df7-42b1-ab3e-bcc740e0c06d" containerName="main" Apr 16 14:17:51.701124 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.700718 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7621aae2-e6fd-48f8-8f10-3e44fde336bf" containerName="main" Apr 16 14:17:51.704487 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.704462 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.708143 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.708117 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 14:17:51.708315 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.708115 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xs8cb\"" Apr 16 14:17:51.708315 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.708117 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-srqv5\"" Apr 16 14:17:51.716888 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.716858 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k"] Apr 16 14:17:51.741129 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.740803 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: 
\"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.741129 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.740855 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.741129 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.740884 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkmng\" (UniqueName: \"kubernetes.io/projected/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-kube-api-access-qkmng\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.741129 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.740919 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.741129 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.740945 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-kserve-provision-location\") pod 
\"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.741129 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.740987 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.842142 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.842092 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.842142 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.842140 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.842453 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.842170 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmng\" (UniqueName: \"kubernetes.io/projected/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-kube-api-access-qkmng\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: 
\"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.842453 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.842203 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.842453 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.842231 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.842453 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.842310 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.842671 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.842554 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.842671 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.842619 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.842671 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.842649 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.842810 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.842709 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.844817 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.844792 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:51.856189 
ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:51.856158 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkmng\" (UniqueName: \"kubernetes.io/projected/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-kube-api-access-qkmng\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:52.018321 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:52.018192 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:52.156006 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:52.155981 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k"] Apr 16 14:17:52.157789 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:17:52.157754 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8a9a8ef_cbcd_4c84_bfd8_ffeb272de439.slice/crio-94725e80a58c222c9fc8aaacbd020b3b8c00fc86ecbe553a86c4874ce4e24213 WatchSource:0}: Error finding container 94725e80a58c222c9fc8aaacbd020b3b8c00fc86ecbe553a86c4874ce4e24213: Status 404 returned error can't find the container with id 94725e80a58c222c9fc8aaacbd020b3b8c00fc86ecbe553a86c4874ce4e24213 Apr 16 14:17:52.626652 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:52.626608 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" event={"ID":"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439","Type":"ContainerStarted","Data":"caaaa3a2b45d4430006c31c1c0eca552b8c6d6839f4d8f64cf73a99eb11391e2"} Apr 16 14:17:52.626832 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:52.626659 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" event={"ID":"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439","Type":"ContainerStarted","Data":"94725e80a58c222c9fc8aaacbd020b3b8c00fc86ecbe553a86c4874ce4e24213"} Apr 16 14:17:53.632123 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:53.632091 2569 generic.go:358] "Generic (PLEG): container finished" podID="b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" containerID="caaaa3a2b45d4430006c31c1c0eca552b8c6d6839f4d8f64cf73a99eb11391e2" exitCode=0 Apr 16 14:17:53.632649 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:53.632167 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" event={"ID":"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439","Type":"ContainerDied","Data":"caaaa3a2b45d4430006c31c1c0eca552b8c6d6839f4d8f64cf73a99eb11391e2"} Apr 16 14:17:54.639267 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:54.639223 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" event={"ID":"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439","Type":"ContainerStarted","Data":"8811d0b6d953b6450e709f1bee5b7877b2ed795ab2a8bb8637fefefcf1776b8f"} Apr 16 14:17:54.639267 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:54.639274 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" event={"ID":"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439","Type":"ContainerStarted","Data":"a2ec28462e2218aa8601716a880c71fa31fbfe069cb3e3eda2bcadb0fd92b982"} Apr 16 14:17:54.639764 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:54.639363 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:17:54.660574 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:17:54.660527 2569 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" podStartSLOduration=3.660509662 podStartE2EDuration="3.660509662s" podCreationTimestamp="2026-04-16 14:17:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:17:54.658281732 +0000 UTC m=+1109.666517978" watchObservedRunningTime="2026-04-16 14:17:54.660509662 +0000 UTC m=+1109.668745916" Apr 16 14:18:02.018831 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:18:02.018789 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:18:02.018831 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:18:02.018837 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:18:02.021853 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:18:02.021826 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:18:02.671939 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:18:02.671898 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:18:23.676708 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:18:23.676674 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:19:25.572341 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:25.572312 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/2.log" Apr 16 14:19:25.573721 ip-10-0-128-60 
kubenswrapper[2569]: I0416 14:19:25.573697 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/2.log" Apr 16 14:19:32.794431 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:32.794392 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k"] Apr 16 14:19:32.794857 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:32.794691 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" podUID="b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" containerName="main" containerID="cri-o://a2ec28462e2218aa8601716a880c71fa31fbfe069cb3e3eda2bcadb0fd92b982" gracePeriod=30 Apr 16 14:19:32.794857 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:32.794742 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" podUID="b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" containerName="tokenizer" containerID="cri-o://8811d0b6d953b6450e709f1bee5b7877b2ed795ab2a8bb8637fefefcf1776b8f" gracePeriod=30 Apr 16 14:19:33.017874 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:33.017834 2569 generic.go:358] "Generic (PLEG): container finished" podID="b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" containerID="a2ec28462e2218aa8601716a880c71fa31fbfe069cb3e3eda2bcadb0fd92b982" exitCode=0 Apr 16 14:19:33.018046 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:33.017899 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" event={"ID":"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439","Type":"ContainerDied","Data":"a2ec28462e2218aa8601716a880c71fa31fbfe069cb3e3eda2bcadb0fd92b982"} Apr 16 14:19:33.674930 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:19:33.674901 
2569 logging.go:55] [core] [Channel #96 SubChannel #97]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.50:9003", ServerName: "10.133.0.50:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.50:9003: connect: connection refused" Apr 16 14:19:34.024631 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.024598 2569 generic.go:358] "Generic (PLEG): container finished" podID="b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" containerID="8811d0b6d953b6450e709f1bee5b7877b2ed795ab2a8bb8637fefefcf1776b8f" exitCode=0 Apr 16 14:19:34.024631 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.024622 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" event={"ID":"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439","Type":"ContainerDied","Data":"8811d0b6d953b6450e709f1bee5b7877b2ed795ab2a8bb8637fefefcf1776b8f"} Apr 16 14:19:34.145356 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.145330 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:19:34.173486 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.173451 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tls-certs\") pod \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " Apr 16 14:19:34.173641 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.173492 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-uds\") pod \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " Apr 16 14:19:34.173641 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.173514 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkmng\" (UniqueName: \"kubernetes.io/projected/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-kube-api-access-qkmng\") pod \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " Apr 16 14:19:34.173641 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.173542 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-cache\") pod \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " Apr 16 14:19:34.173641 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.173594 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-kserve-provision-location\") pod \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " Apr 16 
14:19:34.173828 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.173677 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-tmp\") pod \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\" (UID: \"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439\") " Apr 16 14:19:34.173828 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.173766 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" (UID: "b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:19:34.173939 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.173845 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" (UID: "b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:19:34.174006 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.173962 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-uds\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:19:34.174006 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.173983 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-cache\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:19:34.174114 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.174065 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" (UID: "b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:19:34.174433 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.174412 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" (UID: "b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:19:34.175735 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.175715 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-kube-api-access-qkmng" (OuterVolumeSpecName: "kube-api-access-qkmng") pod "b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" (UID: "b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439"). InnerVolumeSpecName "kube-api-access-qkmng". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:19:34.175735 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.175714 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" (UID: "b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:19:34.275203 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.275097 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tokenizer-tmp\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:19:34.275203 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.275135 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-tls-certs\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:19:34.275203 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.275148 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qkmng\" (UniqueName: \"kubernetes.io/projected/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-kube-api-access-qkmng\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:19:34.275203 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.275162 2569 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439-kserve-provision-location\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:19:34.675394 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:34.675350 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" podUID="b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.50:9003\" within 1s: context deadline exceeded" Apr 16 14:19:34.675566 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:19:34.675372 2569 logging.go:55] [core] [Channel #96 SubChannel #97]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.50:9003", ServerName: "10.133.0.50:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.50:9003: operation was canceled" Apr 16 14:19:35.030320 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:35.030229 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" Apr 16 14:19:35.030749 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:35.030229 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k" event={"ID":"b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439","Type":"ContainerDied","Data":"94725e80a58c222c9fc8aaacbd020b3b8c00fc86ecbe553a86c4874ce4e24213"} Apr 16 14:19:35.030749 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:35.030383 2569 scope.go:117] "RemoveContainer" containerID="8811d0b6d953b6450e709f1bee5b7877b2ed795ab2a8bb8637fefefcf1776b8f" Apr 16 14:19:35.039859 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:35.039836 2569 scope.go:117] "RemoveContainer" containerID="a2ec28462e2218aa8601716a880c71fa31fbfe069cb3e3eda2bcadb0fd92b982" Apr 16 14:19:35.048005 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:35.047986 2569 scope.go:117] "RemoveContainer" containerID="caaaa3a2b45d4430006c31c1c0eca552b8c6d6839f4d8f64cf73a99eb11391e2" Apr 16 14:19:35.053456 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:35.053427 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k"] Apr 16 14:19:35.056802 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:35.056775 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-nkr9k"] Apr 16 14:19:35.603666 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:35.603636 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" path="/var/lib/kubelet/pods/b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439/volumes" Apr 16 14:19:46.448016 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.447978 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj"] 
Apr 16 14:19:46.448616 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.448351 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" containerName="main" Apr 16 14:19:46.448616 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.448365 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" containerName="main" Apr 16 14:19:46.448616 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.448374 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" containerName="storage-initializer" Apr 16 14:19:46.448616 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.448380 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" containerName="storage-initializer" Apr 16 14:19:46.448616 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.448387 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" containerName="tokenizer" Apr 16 14:19:46.448616 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.448393 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" containerName="tokenizer" Apr 16 14:19:46.448616 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.448457 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" containerName="main" Apr 16 14:19:46.448616 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.448466 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8a9a8ef-cbcd-4c84-bfd8-ffeb272de439" containerName="tokenizer" Apr 16 14:19:46.453231 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.453213 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.457095 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.457066 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xs8cb\"" Apr 16 14:19:46.457344 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.457067 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 14:19:46.457344 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.457078 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-b2p92\"" Apr 16 14:19:46.465218 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.465191 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj"] Apr 16 14:19:46.579172 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.579110 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.579369 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.579237 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 
14:19:46.579369 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.579309 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.579369 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.579358 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.579523 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.579380 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.579523 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.579407 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mljx7\" (UniqueName: \"kubernetes.io/projected/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-kube-api-access-mljx7\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 
16 14:19:46.680569 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.680512 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.680784 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.680687 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.680784 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.680746 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.680990 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.680956 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.680990 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.680981 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.681136 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.681019 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.681136 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.681036 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.681136 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.681051 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mljx7\" (UniqueName: \"kubernetes.io/projected/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-kube-api-access-mljx7\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.681331 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.681223 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.681331 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.681291 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.683125 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.683104 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.689476 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.689452 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mljx7\" (UniqueName: \"kubernetes.io/projected/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-kube-api-access-mljx7\") pod \"stop-feature-test-kserve-router-scheduler-57bf549665-67rzj\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.765712 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.765603 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" Apr 16 14:19:46.901887 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:46.901854 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj"] Apr 16 14:19:46.903748 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:19:46.903716 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2a9f62b_4a94_4f52_a26d_cfc15e1968b9.slice/crio-10fc16f56fc8a7946a59a82175bf72544dbef1e1a656b6468408cec8eb8c0b09 WatchSource:0}: Error finding container 10fc16f56fc8a7946a59a82175bf72544dbef1e1a656b6468408cec8eb8c0b09: Status 404 returned error can't find the container with id 10fc16f56fc8a7946a59a82175bf72544dbef1e1a656b6468408cec8eb8c0b09 Apr 16 14:19:47.084898 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:47.084797 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" event={"ID":"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9","Type":"ContainerStarted","Data":"82810b8d68b84045e87f68359607e9582fd5705978eb208ee4f94289696693d9"} Apr 16 14:19:47.084898 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:47.084845 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" event={"ID":"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9","Type":"ContainerStarted","Data":"10fc16f56fc8a7946a59a82175bf72544dbef1e1a656b6468408cec8eb8c0b09"} Apr 16 14:19:48.090109 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:48.090032 2569 generic.go:358] "Generic (PLEG): container finished" podID="e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" containerID="82810b8d68b84045e87f68359607e9582fd5705978eb208ee4f94289696693d9" exitCode=0 Apr 16 14:19:48.090109 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:48.090076 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" event={"ID":"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9","Type":"ContainerDied","Data":"82810b8d68b84045e87f68359607e9582fd5705978eb208ee4f94289696693d9"}
Apr 16 14:19:49.095684 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:49.095648 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" event={"ID":"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9","Type":"ContainerStarted","Data":"0152b99afc4e4eddca8c3e3481ef218e3c1ac1266fd9bc4f5770803fe8866af3"}
Apr 16 14:19:49.095684 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:49.095691 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" event={"ID":"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9","Type":"ContainerStarted","Data":"31e3acf17e0c13bce83ffffd866af4dac5065699c4f0f28df9df133d60522418"}
Apr 16 14:19:49.096175 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:49.095738 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj"
Apr 16 14:19:49.119019 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:49.118968 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" podStartSLOduration=3.118953886 podStartE2EDuration="3.118953886s" podCreationTimestamp="2026-04-16 14:19:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:19:49.116045806 +0000 UTC m=+1224.124282060" watchObservedRunningTime="2026-04-16 14:19:49.118953886 +0000 UTC m=+1224.127190140"
Apr 16 14:19:56.766740 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:56.766691 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj"
Apr 16 14:19:56.766740 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:56.766751 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj"
Apr 16 14:19:56.769436 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:56.769413 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj"
Apr 16 14:19:57.130506 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:19:57.130419 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj"
Apr 16 14:20:18.134686 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:20:18.134613 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj"
Apr 16 14:21:37.933311 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:37.933265 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj"]
Apr 16 14:21:37.933805 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:37.933597 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" podUID="e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" containerName="main" containerID="cri-o://31e3acf17e0c13bce83ffffd866af4dac5065699c4f0f28df9df133d60522418" gracePeriod=30
Apr 16 14:21:37.933805 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:37.933632 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" podUID="e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" containerName="tokenizer" containerID="cri-o://0152b99afc4e4eddca8c3e3481ef218e3c1ac1266fd9bc4f5770803fe8866af3" gracePeriod=30
Apr 16 14:21:38.134729 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:21:38.134688 2569 logging.go:55] [core] [Channel #155 SubChannel #156]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.51:9003", ServerName: "10.133.0.51:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.51:9003: connect: connection refused"
Apr 16 14:21:38.534245 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:38.534213 2569 generic.go:358] "Generic (PLEG): container finished" podID="e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" containerID="31e3acf17e0c13bce83ffffd866af4dac5065699c4f0f28df9df133d60522418" exitCode=0
Apr 16 14:21:38.534440 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:38.534267 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" event={"ID":"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9","Type":"ContainerDied","Data":"31e3acf17e0c13bce83ffffd866af4dac5065699c4f0f28df9df133d60522418"}
Apr 16 14:21:39.134696 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.134650 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" podUID="e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.51:9003\" within 1s: context deadline exceeded"
Apr 16 14:21:39.291675 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.291648 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj"
Apr 16 14:21:39.391672 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.391575 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mljx7\" (UniqueName: \"kubernetes.io/projected/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-kube-api-access-mljx7\") pod \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") "
Apr 16 14:21:39.391672 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.391666 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-kserve-provision-location\") pod \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") "
Apr 16 14:21:39.391919 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.391752 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-tmp\") pod \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") "
Apr 16 14:21:39.391919 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.391786 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-cache\") pod \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") "
Apr 16 14:21:39.391919 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.391853 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-uds\") pod \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") "
Apr 16 14:21:39.391919 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.391886 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tls-certs\") pod \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\" (UID: \"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9\") "
Apr 16 14:21:39.392132 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.392103 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" (UID: "e2a9f62b-4a94-4f52-a26d-cfc15e1968b9"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:21:39.392132 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.392119 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" (UID: "e2a9f62b-4a94-4f52-a26d-cfc15e1968b9"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:21:39.392233 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.392135 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" (UID: "e2a9f62b-4a94-4f52-a26d-cfc15e1968b9"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:21:39.392499 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.392476 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" (UID: "e2a9f62b-4a94-4f52-a26d-cfc15e1968b9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:21:39.393941 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.393916 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-kube-api-access-mljx7" (OuterVolumeSpecName: "kube-api-access-mljx7") pod "e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" (UID: "e2a9f62b-4a94-4f52-a26d-cfc15e1968b9"). InnerVolumeSpecName "kube-api-access-mljx7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:21:39.394070 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.393986 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" (UID: "e2a9f62b-4a94-4f52-a26d-cfc15e1968b9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:21:39.493511 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.493472 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tls-certs\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:21:39.493511 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.493505 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mljx7\" (UniqueName: \"kubernetes.io/projected/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-kube-api-access-mljx7\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:21:39.493511 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.493517 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-kserve-provision-location\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:21:39.493760 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.493527 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-tmp\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:21:39.493760 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.493536 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-cache\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:21:39.493760 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.493545 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9-tokenizer-uds\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:21:39.540360 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.540328 2569 generic.go:358] "Generic (PLEG): container finished" podID="e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" containerID="0152b99afc4e4eddca8c3e3481ef218e3c1ac1266fd9bc4f5770803fe8866af3" exitCode=0
Apr 16 14:21:39.540513 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.540407 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj"
Apr 16 14:21:39.540513 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.540415 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" event={"ID":"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9","Type":"ContainerDied","Data":"0152b99afc4e4eddca8c3e3481ef218e3c1ac1266fd9bc4f5770803fe8866af3"}
Apr 16 14:21:39.540513 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.540455 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj" event={"ID":"e2a9f62b-4a94-4f52-a26d-cfc15e1968b9","Type":"ContainerDied","Data":"10fc16f56fc8a7946a59a82175bf72544dbef1e1a656b6468408cec8eb8c0b09"}
Apr 16 14:21:39.540513 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.540472 2569 scope.go:117] "RemoveContainer" containerID="0152b99afc4e4eddca8c3e3481ef218e3c1ac1266fd9bc4f5770803fe8866af3"
Apr 16 14:21:39.550290 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.550266 2569 scope.go:117] "RemoveContainer" containerID="31e3acf17e0c13bce83ffffd866af4dac5065699c4f0f28df9df133d60522418"
Apr 16 14:21:39.558800 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.558775 2569 scope.go:117] "RemoveContainer" containerID="82810b8d68b84045e87f68359607e9582fd5705978eb208ee4f94289696693d9"
Apr 16 14:21:39.567101 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.567082 2569 scope.go:117] "RemoveContainer" containerID="0152b99afc4e4eddca8c3e3481ef218e3c1ac1266fd9bc4f5770803fe8866af3"
Apr 16 14:21:39.567419 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:21:39.567394 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0152b99afc4e4eddca8c3e3481ef218e3c1ac1266fd9bc4f5770803fe8866af3\": container with ID starting with 0152b99afc4e4eddca8c3e3481ef218e3c1ac1266fd9bc4f5770803fe8866af3 not found: ID does not exist" containerID="0152b99afc4e4eddca8c3e3481ef218e3c1ac1266fd9bc4f5770803fe8866af3"
Apr 16 14:21:39.567491 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.567430 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0152b99afc4e4eddca8c3e3481ef218e3c1ac1266fd9bc4f5770803fe8866af3"} err="failed to get container status \"0152b99afc4e4eddca8c3e3481ef218e3c1ac1266fd9bc4f5770803fe8866af3\": rpc error: code = NotFound desc = could not find container \"0152b99afc4e4eddca8c3e3481ef218e3c1ac1266fd9bc4f5770803fe8866af3\": container with ID starting with 0152b99afc4e4eddca8c3e3481ef218e3c1ac1266fd9bc4f5770803fe8866af3 not found: ID does not exist"
Apr 16 14:21:39.567491 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.567450 2569 scope.go:117] "RemoveContainer" containerID="31e3acf17e0c13bce83ffffd866af4dac5065699c4f0f28df9df133d60522418"
Apr 16 14:21:39.567674 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:21:39.567654 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31e3acf17e0c13bce83ffffd866af4dac5065699c4f0f28df9df133d60522418\": container with ID starting with 31e3acf17e0c13bce83ffffd866af4dac5065699c4f0f28df9df133d60522418 not found: ID does not exist" containerID="31e3acf17e0c13bce83ffffd866af4dac5065699c4f0f28df9df133d60522418"
Apr 16 14:21:39.567713 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.567681 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31e3acf17e0c13bce83ffffd866af4dac5065699c4f0f28df9df133d60522418"} err="failed to get container status \"31e3acf17e0c13bce83ffffd866af4dac5065699c4f0f28df9df133d60522418\": rpc error: code = NotFound desc = could not find container \"31e3acf17e0c13bce83ffffd866af4dac5065699c4f0f28df9df133d60522418\": container with ID starting with 31e3acf17e0c13bce83ffffd866af4dac5065699c4f0f28df9df133d60522418 not found: ID does not exist"
Apr 16 14:21:39.567713 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.567698 2569 scope.go:117] "RemoveContainer" containerID="82810b8d68b84045e87f68359607e9582fd5705978eb208ee4f94289696693d9"
Apr 16 14:21:39.567953 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:21:39.567933 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82810b8d68b84045e87f68359607e9582fd5705978eb208ee4f94289696693d9\": container with ID starting with 82810b8d68b84045e87f68359607e9582fd5705978eb208ee4f94289696693d9 not found: ID does not exist" containerID="82810b8d68b84045e87f68359607e9582fd5705978eb208ee4f94289696693d9"
Apr 16 14:21:39.568011 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.567955 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82810b8d68b84045e87f68359607e9582fd5705978eb208ee4f94289696693d9"} err="failed to get container status \"82810b8d68b84045e87f68359607e9582fd5705978eb208ee4f94289696693d9\": rpc error: code = NotFound desc = could not find container \"82810b8d68b84045e87f68359607e9582fd5705978eb208ee4f94289696693d9\": container with ID starting with 82810b8d68b84045e87f68359607e9582fd5705978eb208ee4f94289696693d9 not found: ID does not exist"
Apr 16 14:21:39.576202 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.576173 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj"]
Apr 16 14:21:39.588023 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.587998 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57bf549665-67rzj"]
Apr 16 14:21:39.604896 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:39.604866 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" path="/var/lib/kubelet/pods/e2a9f62b-4a94-4f52-a26d-cfc15e1968b9/volumes"
Apr 16 14:21:40.390515 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.390477 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-6465f8dd5f-vllbv"]
Apr 16 14:21:40.390909 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.390853 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" containerName="main"
Apr 16 14:21:40.390909 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.390864 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" containerName="main"
Apr 16 14:21:40.390909 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.390884 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" containerName="tokenizer"
Apr 16 14:21:40.390909 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.390890 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" containerName="tokenizer"
Apr 16 14:21:40.390909 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.390903 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" containerName="storage-initializer"
Apr 16 14:21:40.390909 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.390909 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" containerName="storage-initializer"
Apr 16 14:21:40.391098 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.390962 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" containerName="main"
Apr 16 14:21:40.391098 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.390970 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2a9f62b-4a94-4f52-a26d-cfc15e1968b9" containerName="tokenizer"
Apr 16 14:21:40.395529 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.395511 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6465f8dd5f-vllbv"
Apr 16 14:21:40.398796 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.398773 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 16 14:21:40.400026 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.400007 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-nb7bt\""
Apr 16 14:21:40.403797 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.403774 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6465f8dd5f-vllbv"]
Apr 16 14:21:40.503141 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.503098 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44db5f20-7a7a-484e-85ff-c7808be0333b-cert\") pod \"llmisvc-controller-manager-6465f8dd5f-vllbv\" (UID: \"44db5f20-7a7a-484e-85ff-c7808be0333b\") " pod="kserve/llmisvc-controller-manager-6465f8dd5f-vllbv"
Apr 16 14:21:40.503353 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.503177 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxrzs\" (UniqueName: \"kubernetes.io/projected/44db5f20-7a7a-484e-85ff-c7808be0333b-kube-api-access-kxrzs\") pod \"llmisvc-controller-manager-6465f8dd5f-vllbv\" (UID: \"44db5f20-7a7a-484e-85ff-c7808be0333b\") " pod="kserve/llmisvc-controller-manager-6465f8dd5f-vllbv"
Apr 16 14:21:40.603962 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.603933 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44db5f20-7a7a-484e-85ff-c7808be0333b-cert\") pod \"llmisvc-controller-manager-6465f8dd5f-vllbv\" (UID: \"44db5f20-7a7a-484e-85ff-c7808be0333b\") " pod="kserve/llmisvc-controller-manager-6465f8dd5f-vllbv"
Apr 16 14:21:40.604146 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.603990 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxrzs\" (UniqueName: \"kubernetes.io/projected/44db5f20-7a7a-484e-85ff-c7808be0333b-kube-api-access-kxrzs\") pod \"llmisvc-controller-manager-6465f8dd5f-vllbv\" (UID: \"44db5f20-7a7a-484e-85ff-c7808be0333b\") " pod="kserve/llmisvc-controller-manager-6465f8dd5f-vllbv"
Apr 16 14:21:40.606232 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.606203 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44db5f20-7a7a-484e-85ff-c7808be0333b-cert\") pod \"llmisvc-controller-manager-6465f8dd5f-vllbv\" (UID: \"44db5f20-7a7a-484e-85ff-c7808be0333b\") " pod="kserve/llmisvc-controller-manager-6465f8dd5f-vllbv"
Apr 16 14:21:40.612470 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.612446 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxrzs\" (UniqueName: \"kubernetes.io/projected/44db5f20-7a7a-484e-85ff-c7808be0333b-kube-api-access-kxrzs\") pod \"llmisvc-controller-manager-6465f8dd5f-vllbv\" (UID: \"44db5f20-7a7a-484e-85ff-c7808be0333b\") " pod="kserve/llmisvc-controller-manager-6465f8dd5f-vllbv"
Apr 16 14:21:40.706643 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.706609 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6465f8dd5f-vllbv"
Apr 16 14:21:40.831076 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:40.831050 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6465f8dd5f-vllbv"]
Apr 16 14:21:40.833201 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:21:40.833158 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod44db5f20_7a7a_484e_85ff_c7808be0333b.slice/crio-a37801fdefa2e6dc39ee282a140c809f91c2f585133269b3ff472ee42d89e97d WatchSource:0}: Error finding container a37801fdefa2e6dc39ee282a140c809f91c2f585133269b3ff472ee42d89e97d: Status 404 returned error can't find the container with id a37801fdefa2e6dc39ee282a140c809f91c2f585133269b3ff472ee42d89e97d
Apr 16 14:21:41.551215 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:41.551183 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6465f8dd5f-vllbv" event={"ID":"44db5f20-7a7a-484e-85ff-c7808be0333b","Type":"ContainerStarted","Data":"a37801fdefa2e6dc39ee282a140c809f91c2f585133269b3ff472ee42d89e97d"}
Apr 16 14:21:44.565842 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:44.565731 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6465f8dd5f-vllbv" event={"ID":"44db5f20-7a7a-484e-85ff-c7808be0333b","Type":"ContainerStarted","Data":"2107d7d15f328009c2dffcbebc52dc491573974f0bbc07073698466164c52631"}
Apr 16 14:21:44.565842 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:44.565825 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-6465f8dd5f-vllbv"
Apr 16 14:21:44.586063 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:21:44.586000 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-6465f8dd5f-vllbv" podStartSLOduration=1.219099937 podStartE2EDuration="4.585982711s" podCreationTimestamp="2026-04-16 14:21:40 +0000 UTC" firstStartedPulling="2026-04-16 14:21:40.834506471 +0000 UTC m=+1335.842742709" lastFinishedPulling="2026-04-16 14:21:44.201389249 +0000 UTC m=+1339.209625483" observedRunningTime="2026-04-16 14:21:44.583322186 +0000 UTC m=+1339.591558442" watchObservedRunningTime="2026-04-16 14:21:44.585982711 +0000 UTC m=+1339.594218977"
Apr 16 14:22:15.573051 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:22:15.573018 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-6465f8dd5f-vllbv"
Apr 16 14:24:25.613791 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:24:25.613755 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/2.log"
Apr 16 14:24:25.616776 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:24:25.616753 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/2.log"
Apr 16 14:27:07.681309 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.681246 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"]
Apr 16 14:27:07.685165 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.685144 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.688211 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.688183 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-xmlr2\""
Apr 16 14:27:07.688361 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.688225 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 16 14:27:07.689636 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.689617 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xs8cb\""
Apr 16 14:27:07.699853 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.699823 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"]
Apr 16 14:27:07.705000 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.704971 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.705162 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.705016 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzprl\" (UniqueName: \"kubernetes.io/projected/4765f52c-4400-4c15-809c-7b96f1da21ff-kube-api-access-nzprl\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.705162 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.705038 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.705162 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.705061 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4765f52c-4400-4c15-809c-7b96f1da21ff-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.705162 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.705085 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.705162 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.705110 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.806348 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.806313 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.806553 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.806368 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzprl\" (UniqueName: \"kubernetes.io/projected/4765f52c-4400-4c15-809c-7b96f1da21ff-kube-api-access-nzprl\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.806553 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.806399 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.806553 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.806432 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4765f52c-4400-4c15-809c-7b96f1da21ff-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.806553 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.806452 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.806553 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.806478 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.806842 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.806775 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.806842 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.806814 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.806918 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.806849 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.806918 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.806889 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.808923 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.808905 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4765f52c-4400-4c15-809c-7b96f1da21ff-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.815472 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.815449 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzprl\" (UniqueName: \"kubernetes.io/projected/4765f52c-4400-4c15-809c-7b96f1da21ff-kube-api-access-nzprl\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:07.996076 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:07.995978 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
Apr 16 14:27:08.137736 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:08.137696 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"]
Apr 16 14:27:08.145148 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:27:08.145111 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4765f52c_4400_4c15_809c_7b96f1da21ff.slice/crio-0a2f356043cf3645a73cef3b1b89a0307bc7cc977cac11f000004be40f6c7204 WatchSource:0}: Error finding container 0a2f356043cf3645a73cef3b1b89a0307bc7cc977cac11f000004be40f6c7204: Status 404 returned error can't find the container with id 0a2f356043cf3645a73cef3b1b89a0307bc7cc977cac11f000004be40f6c7204
Apr 16 14:27:08.147138 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:08.147118 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:27:08.818724 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:08.818689 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" event={"ID":"4765f52c-4400-4c15-809c-7b96f1da21ff","Type":"ContainerStarted","Data":"a4f42dcea9925a4ed1e5e146fe628152673de2e5a2f7de914bfdd6c80f6aebb3"}
Apr 16 14:27:08.818724 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:08.818725 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"
event={"ID":"4765f52c-4400-4c15-809c-7b96f1da21ff","Type":"ContainerStarted","Data":"0a2f356043cf3645a73cef3b1b89a0307bc7cc977cac11f000004be40f6c7204"} Apr 16 14:27:09.824164 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:09.824131 2569 generic.go:358] "Generic (PLEG): container finished" podID="4765f52c-4400-4c15-809c-7b96f1da21ff" containerID="a4f42dcea9925a4ed1e5e146fe628152673de2e5a2f7de914bfdd6c80f6aebb3" exitCode=0 Apr 16 14:27:09.824672 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:09.824218 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" event={"ID":"4765f52c-4400-4c15-809c-7b96f1da21ff","Type":"ContainerDied","Data":"a4f42dcea9925a4ed1e5e146fe628152673de2e5a2f7de914bfdd6c80f6aebb3"} Apr 16 14:27:10.830527 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:10.830485 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" event={"ID":"4765f52c-4400-4c15-809c-7b96f1da21ff","Type":"ContainerStarted","Data":"241450739c6a379b85fe6221344577e89ceee0bb7a05fcd0a82cae8218c1faf8"} Apr 16 14:27:10.830527 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:10.830531 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" event={"ID":"4765f52c-4400-4c15-809c-7b96f1da21ff","Type":"ContainerStarted","Data":"d9214d4ecdb2134e2612f406a44c6489ba4282b9938b18e9b86a3a40a777fd6a"} Apr 16 14:27:10.831098 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:10.830583 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" Apr 16 14:27:10.851741 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:10.851693 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" podStartSLOduration=3.851677184 podStartE2EDuration="3.851677184s" podCreationTimestamp="2026-04-16 14:27:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:27:10.851429207 +0000 UTC m=+1665.859665464" watchObservedRunningTime="2026-04-16 14:27:10.851677184 +0000 UTC m=+1665.859913439" Apr 16 14:27:17.997072 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:17.997032 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" Apr 16 14:27:17.997589 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:17.997090 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" Apr 16 14:27:17.999848 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:17.999824 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" Apr 16 14:27:18.862639 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:18.862605 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" Apr 16 14:27:39.869109 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:27:39.869077 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" Apr 16 14:29:25.646319 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:29:25.646289 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/2.log" Apr 16 14:29:25.650703 
ip-10-0-128-60 kubenswrapper[2569]: I0416 14:29:25.650676 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/2.log" Apr 16 14:30:08.761430 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:08.761393 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"] Apr 16 14:30:08.761939 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:08.761679 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" podUID="4765f52c-4400-4c15-809c-7b96f1da21ff" containerName="main" containerID="cri-o://d9214d4ecdb2134e2612f406a44c6489ba4282b9938b18e9b86a3a40a777fd6a" gracePeriod=30 Apr 16 14:30:08.761939 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:08.761717 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" podUID="4765f52c-4400-4c15-809c-7b96f1da21ff" containerName="tokenizer" containerID="cri-o://241450739c6a379b85fe6221344577e89ceee0bb7a05fcd0a82cae8218c1faf8" gracePeriod=30 Apr 16 14:30:08.862205 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:08.862164 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" podUID="4765f52c-4400-4c15-809c-7b96f1da21ff" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.53:8082/healthz\": dial tcp 10.133.0.53:8082: connect: connection refused" Apr 16 14:30:09.505880 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:09.505847 2569 generic.go:358] "Generic (PLEG): container finished" podID="4765f52c-4400-4c15-809c-7b96f1da21ff" containerID="d9214d4ecdb2134e2612f406a44c6489ba4282b9938b18e9b86a3a40a777fd6a" 
exitCode=0 Apr 16 14:30:09.506075 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:09.505923 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" event={"ID":"4765f52c-4400-4c15-809c-7b96f1da21ff","Type":"ContainerDied","Data":"d9214d4ecdb2134e2612f406a44c6489ba4282b9938b18e9b86a3a40a777fd6a"} Apr 16 14:30:09.866744 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:30:09.866662 2569 logging.go:55] [core] [Channel #256 SubChannel #257]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.53:9003", ServerName: "10.133.0.53:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.53:9003: connect: connection refused" Apr 16 14:30:10.104873 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.104849 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" Apr 16 14:30:10.252565 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.252529 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4765f52c-4400-4c15-809c-7b96f1da21ff-tls-certs\") pod \"4765f52c-4400-4c15-809c-7b96f1da21ff\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " Apr 16 14:30:10.252770 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.252598 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzprl\" (UniqueName: \"kubernetes.io/projected/4765f52c-4400-4c15-809c-7b96f1da21ff-kube-api-access-nzprl\") pod \"4765f52c-4400-4c15-809c-7b96f1da21ff\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " Apr 16 14:30:10.252770 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.252639 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-cache\") pod \"4765f52c-4400-4c15-809c-7b96f1da21ff\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " Apr 16 14:30:10.252770 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.252680 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-uds\") pod \"4765f52c-4400-4c15-809c-7b96f1da21ff\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " Apr 16 14:30:10.252770 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.252713 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-kserve-provision-location\") pod \"4765f52c-4400-4c15-809c-7b96f1da21ff\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " Apr 16 14:30:10.252770 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.252734 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-tmp\") pod \"4765f52c-4400-4c15-809c-7b96f1da21ff\" (UID: \"4765f52c-4400-4c15-809c-7b96f1da21ff\") " Apr 16 14:30:10.253040 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.252961 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "4765f52c-4400-4c15-809c-7b96f1da21ff" (UID: "4765f52c-4400-4c15-809c-7b96f1da21ff"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:30:10.253040 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.252977 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "4765f52c-4400-4c15-809c-7b96f1da21ff" (UID: "4765f52c-4400-4c15-809c-7b96f1da21ff"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:30:10.253147 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.253082 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-cache\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:30:10.253147 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.253102 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-uds\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:30:10.253359 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.253339 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "4765f52c-4400-4c15-809c-7b96f1da21ff" (UID: "4765f52c-4400-4c15-809c-7b96f1da21ff"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:30:10.253599 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.253575 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4765f52c-4400-4c15-809c-7b96f1da21ff" (UID: "4765f52c-4400-4c15-809c-7b96f1da21ff"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:30:10.254831 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.254807 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4765f52c-4400-4c15-809c-7b96f1da21ff-kube-api-access-nzprl" (OuterVolumeSpecName: "kube-api-access-nzprl") pod "4765f52c-4400-4c15-809c-7b96f1da21ff" (UID: "4765f52c-4400-4c15-809c-7b96f1da21ff"). InnerVolumeSpecName "kube-api-access-nzprl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:30:10.254923 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.254843 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4765f52c-4400-4c15-809c-7b96f1da21ff-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4765f52c-4400-4c15-809c-7b96f1da21ff" (UID: "4765f52c-4400-4c15-809c-7b96f1da21ff"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:30:10.354346 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.354309 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4765f52c-4400-4c15-809c-7b96f1da21ff-tls-certs\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:30:10.354346 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.354338 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nzprl\" (UniqueName: \"kubernetes.io/projected/4765f52c-4400-4c15-809c-7b96f1da21ff-kube-api-access-nzprl\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:30:10.354346 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.354348 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-kserve-provision-location\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 
14:30:10.354346 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.354359 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4765f52c-4400-4c15-809c-7b96f1da21ff-tokenizer-tmp\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\"" Apr 16 14:30:10.511819 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.511728 2569 generic.go:358] "Generic (PLEG): container finished" podID="4765f52c-4400-4c15-809c-7b96f1da21ff" containerID="241450739c6a379b85fe6221344577e89ceee0bb7a05fcd0a82cae8218c1faf8" exitCode=0 Apr 16 14:30:10.511819 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.511812 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" Apr 16 14:30:10.512002 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.511812 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" event={"ID":"4765f52c-4400-4c15-809c-7b96f1da21ff","Type":"ContainerDied","Data":"241450739c6a379b85fe6221344577e89ceee0bb7a05fcd0a82cae8218c1faf8"} Apr 16 14:30:10.512002 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.511853 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" event={"ID":"4765f52c-4400-4c15-809c-7b96f1da21ff","Type":"ContainerDied","Data":"0a2f356043cf3645a73cef3b1b89a0307bc7cc977cac11f000004be40f6c7204"} Apr 16 14:30:10.512002 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.511869 2569 scope.go:117] "RemoveContainer" containerID="241450739c6a379b85fe6221344577e89ceee0bb7a05fcd0a82cae8218c1faf8" Apr 16 14:30:10.522438 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.522151 2569 scope.go:117] "RemoveContainer" containerID="d9214d4ecdb2134e2612f406a44c6489ba4282b9938b18e9b86a3a40a777fd6a" Apr 16 14:30:10.531593 
ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.531574 2569 scope.go:117] "RemoveContainer" containerID="a4f42dcea9925a4ed1e5e146fe628152673de2e5a2f7de914bfdd6c80f6aebb3" Apr 16 14:30:10.539062 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.539038 2569 scope.go:117] "RemoveContainer" containerID="241450739c6a379b85fe6221344577e89ceee0bb7a05fcd0a82cae8218c1faf8" Apr 16 14:30:10.539410 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:30:10.539382 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"241450739c6a379b85fe6221344577e89ceee0bb7a05fcd0a82cae8218c1faf8\": container with ID starting with 241450739c6a379b85fe6221344577e89ceee0bb7a05fcd0a82cae8218c1faf8 not found: ID does not exist" containerID="241450739c6a379b85fe6221344577e89ceee0bb7a05fcd0a82cae8218c1faf8" Apr 16 14:30:10.539515 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.539423 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"241450739c6a379b85fe6221344577e89ceee0bb7a05fcd0a82cae8218c1faf8"} err="failed to get container status \"241450739c6a379b85fe6221344577e89ceee0bb7a05fcd0a82cae8218c1faf8\": rpc error: code = NotFound desc = could not find container \"241450739c6a379b85fe6221344577e89ceee0bb7a05fcd0a82cae8218c1faf8\": container with ID starting with 241450739c6a379b85fe6221344577e89ceee0bb7a05fcd0a82cae8218c1faf8 not found: ID does not exist" Apr 16 14:30:10.539515 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.539466 2569 scope.go:117] "RemoveContainer" containerID="d9214d4ecdb2134e2612f406a44c6489ba4282b9938b18e9b86a3a40a777fd6a" Apr 16 14:30:10.539999 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:30:10.539973 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9214d4ecdb2134e2612f406a44c6489ba4282b9938b18e9b86a3a40a777fd6a\": container with ID starting with 
d9214d4ecdb2134e2612f406a44c6489ba4282b9938b18e9b86a3a40a777fd6a not found: ID does not exist" containerID="d9214d4ecdb2134e2612f406a44c6489ba4282b9938b18e9b86a3a40a777fd6a" Apr 16 14:30:10.540167 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.540008 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9214d4ecdb2134e2612f406a44c6489ba4282b9938b18e9b86a3a40a777fd6a"} err="failed to get container status \"d9214d4ecdb2134e2612f406a44c6489ba4282b9938b18e9b86a3a40a777fd6a\": rpc error: code = NotFound desc = could not find container \"d9214d4ecdb2134e2612f406a44c6489ba4282b9938b18e9b86a3a40a777fd6a\": container with ID starting with d9214d4ecdb2134e2612f406a44c6489ba4282b9938b18e9b86a3a40a777fd6a not found: ID does not exist" Apr 16 14:30:10.540167 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.540032 2569 scope.go:117] "RemoveContainer" containerID="a4f42dcea9925a4ed1e5e146fe628152673de2e5a2f7de914bfdd6c80f6aebb3" Apr 16 14:30:10.540350 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:30:10.540318 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4f42dcea9925a4ed1e5e146fe628152673de2e5a2f7de914bfdd6c80f6aebb3\": container with ID starting with a4f42dcea9925a4ed1e5e146fe628152673de2e5a2f7de914bfdd6c80f6aebb3 not found: ID does not exist" containerID="a4f42dcea9925a4ed1e5e146fe628152673de2e5a2f7de914bfdd6c80f6aebb3" Apr 16 14:30:10.540414 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.540361 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4f42dcea9925a4ed1e5e146fe628152673de2e5a2f7de914bfdd6c80f6aebb3"} err="failed to get container status \"a4f42dcea9925a4ed1e5e146fe628152673de2e5a2f7de914bfdd6c80f6aebb3\": rpc error: code = NotFound desc = could not find container \"a4f42dcea9925a4ed1e5e146fe628152673de2e5a2f7de914bfdd6c80f6aebb3\": container with ID starting with 
a4f42dcea9925a4ed1e5e146fe628152673de2e5a2f7de914bfdd6c80f6aebb3 not found: ID does not exist" Apr 16 14:30:10.542568 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.542549 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"] Apr 16 14:30:10.547572 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.547551 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj"] Apr 16 14:30:10.867004 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:10.866898 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelb4kj" podUID="4765f52c-4400-4c15-809c-7b96f1da21ff" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.53:9003\" within 1s: context deadline exceeded" Apr 16 14:30:11.586131 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.586095 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6"] Apr 16 14:30:11.586592 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.586574 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4765f52c-4400-4c15-809c-7b96f1da21ff" containerName="storage-initializer" Apr 16 14:30:11.586697 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.586594 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4765f52c-4400-4c15-809c-7b96f1da21ff" containerName="storage-initializer" Apr 16 14:30:11.586697 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.586619 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4765f52c-4400-4c15-809c-7b96f1da21ff" containerName="main" Apr 16 14:30:11.586697 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.586627 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4765f52c-4400-4c15-809c-7b96f1da21ff" containerName="main" Apr 16 14:30:11.586697 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.586644 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4765f52c-4400-4c15-809c-7b96f1da21ff" containerName="tokenizer" Apr 16 14:30:11.586697 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.586653 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4765f52c-4400-4c15-809c-7b96f1da21ff" containerName="tokenizer" Apr 16 14:30:11.586927 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.586756 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="4765f52c-4400-4c15-809c-7b96f1da21ff" containerName="main" Apr 16 14:30:11.586927 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.586769 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="4765f52c-4400-4c15-809c-7b96f1da21ff" containerName="tokenizer" Apr 16 14:30:11.591531 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.591484 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.597197 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.597171 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-fghjx\"" Apr 16 14:30:11.604060 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.604029 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4765f52c-4400-4c15-809c-7b96f1da21ff" path="/var/lib/kubelet/pods/4765f52c-4400-4c15-809c-7b96f1da21ff/volumes" Apr 16 14:30:11.620437 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.620409 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6"] Apr 16 14:30:11.767771 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.767733 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/83d1ea6b-518f-4aa2-a83d-59086aa382c2-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.767964 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.767776 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mzcd\" (UniqueName: \"kubernetes.io/projected/83d1ea6b-518f-4aa2-a83d-59086aa382c2-kube-api-access-7mzcd\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.767964 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.767810 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" 
(UniqueName: \"kubernetes.io/projected/83d1ea6b-518f-4aa2-a83d-59086aa382c2-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.767964 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.767862 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/83d1ea6b-518f-4aa2-a83d-59086aa382c2-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.767964 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.767937 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/83d1ea6b-518f-4aa2-a83d-59086aa382c2-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.768133 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.767966 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/83d1ea6b-518f-4aa2-a83d-59086aa382c2-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.768133 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.767983 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: 
\"kubernetes.io/downward-api/83d1ea6b-518f-4aa2-a83d-59086aa382c2-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.768133 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.768021 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/83d1ea6b-518f-4aa2-a83d-59086aa382c2-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.768133 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.768072 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/83d1ea6b-518f-4aa2-a83d-59086aa382c2-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.869343 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.869215 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/83d1ea6b-518f-4aa2-a83d-59086aa382c2-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.869343 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.869281 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/83d1ea6b-518f-4aa2-a83d-59086aa382c2-credential-socket\") pod 
\"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.869343 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.869306 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/83d1ea6b-518f-4aa2-a83d-59086aa382c2-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.869948 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.869353 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/83d1ea6b-518f-4aa2-a83d-59086aa382c2-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.869948 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.869390 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/83d1ea6b-518f-4aa2-a83d-59086aa382c2-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.869948 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.869453 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/83d1ea6b-518f-4aa2-a83d-59086aa382c2-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.869948 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.869482 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mzcd\" (UniqueName: \"kubernetes.io/projected/83d1ea6b-518f-4aa2-a83d-59086aa382c2-kube-api-access-7mzcd\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.869948 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.869522 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/83d1ea6b-518f-4aa2-a83d-59086aa382c2-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.869948 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.869587 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/83d1ea6b-518f-4aa2-a83d-59086aa382c2-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.869948 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.869724 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/83d1ea6b-518f-4aa2-a83d-59086aa382c2-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.869948 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.869761 
2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/83d1ea6b-518f-4aa2-a83d-59086aa382c2-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.869948 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.869856 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/83d1ea6b-518f-4aa2-a83d-59086aa382c2-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.870293 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.869978 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/83d1ea6b-518f-4aa2-a83d-59086aa382c2-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.870293 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.870157 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/83d1ea6b-518f-4aa2-a83d-59086aa382c2-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.871704 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.871686 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/83d1ea6b-518f-4aa2-a83d-59086aa382c2-istio-envoy\") pod 
\"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.871919 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.871903 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/83d1ea6b-518f-4aa2-a83d-59086aa382c2-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.880463 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.880441 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mzcd\" (UniqueName: \"kubernetes.io/projected/83d1ea6b-518f-4aa2-a83d-59086aa382c2-kube-api-access-7mzcd\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.883051 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.883022 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/83d1ea6b-518f-4aa2-a83d-59086aa382c2-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-qvzp6\" (UID: \"83d1ea6b-518f-4aa2-a83d-59086aa382c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:11.903073 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:11.903047 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:12.050797 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:12.050768 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6"] Apr 16 14:30:12.051897 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:30:12.051870 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83d1ea6b_518f_4aa2_a83d_59086aa382c2.slice/crio-44718681c324c0b6d746bb447625d8004f763f93c013a481a8f2a929d18d2484 WatchSource:0}: Error finding container 44718681c324c0b6d746bb447625d8004f763f93c013a481a8f2a929d18d2484: Status 404 returned error can't find the container with id 44718681c324c0b6d746bb447625d8004f763f93c013a481a8f2a929d18d2484 Apr 16 14:30:12.053992 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:12.053970 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 14:30:12.054055 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:12.054025 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 14:30:12.054093 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:12.054054 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 14:30:12.521293 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:12.521243 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" 
event={"ID":"83d1ea6b-518f-4aa2-a83d-59086aa382c2","Type":"ContainerStarted","Data":"6e96758a56e8d2704372e160aa962c93a187f2cb4bf0cf2d30a13f607a2594b1"} Apr 16 14:30:12.521293 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:12.521291 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" event={"ID":"83d1ea6b-518f-4aa2-a83d-59086aa382c2","Type":"ContainerStarted","Data":"44718681c324c0b6d746bb447625d8004f763f93c013a481a8f2a929d18d2484"} Apr 16 14:30:12.561465 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:12.561411 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" podStartSLOduration=1.561395292 podStartE2EDuration="1.561395292s" podCreationTimestamp="2026-04-16 14:30:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:12.561188931 +0000 UTC m=+1847.569425183" watchObservedRunningTime="2026-04-16 14:30:12.561395292 +0000 UTC m=+1847.569631614" Apr 16 14:30:12.903514 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:12.903436 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:13.908579 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:13.908551 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:14.531906 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:14.531871 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:14.532999 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:14.532983 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-qvzp6" Apr 16 14:30:28.833268 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:28.833211 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn"] Apr 16 14:30:28.836937 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:28.836914 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:28.840771 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:28.840748 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xs8cb\"" Apr 16 14:30:28.840889 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:28.840770 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 16 14:30:28.846811 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:28.846787 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn"] Apr 16 14:30:28.929349 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:28.929311 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-home\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:28.929349 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:28.929350 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k498z\" (UniqueName: \"kubernetes.io/projected/506b4bbb-0a06-4109-b8d9-c11a5e887d40-kube-api-access-k498z\") pod 
\"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:28.929619 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:28.929373 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/506b4bbb-0a06-4109-b8d9-c11a5e887d40-tls-certs\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:28.929619 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:28.929433 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:28.929619 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:28.929527 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-dshm\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:28.929619 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:28.929605 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-model-cache\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: 
\"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:29.030836 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:29.030800 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-dshm\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:29.031039 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:29.030866 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-model-cache\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:29.031039 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:29.030917 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-home\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:29.031039 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:29.030942 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k498z\" (UniqueName: \"kubernetes.io/projected/506b4bbb-0a06-4109-b8d9-c11a5e887d40-kube-api-access-k498z\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:29.031039 ip-10-0-128-60 
kubenswrapper[2569]: I0416 14:30:29.030969 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/506b4bbb-0a06-4109-b8d9-c11a5e887d40-tls-certs\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:29.031039 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:29.030995 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:29.031358 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:29.031330 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-home\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:29.031409 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:29.031376 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-model-cache\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:29.031452 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:29.031413 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:29.033143 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:29.033118 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-dshm\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:29.033419 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:29.033401 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/506b4bbb-0a06-4109-b8d9-c11a5e887d40-tls-certs\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:29.040285 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:29.040238 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k498z\" (UniqueName: \"kubernetes.io/projected/506b4bbb-0a06-4109-b8d9-c11a5e887d40-kube-api-access-k498z\") pod \"scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:29.148674 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:29.148583 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:29.292943 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:29.292916 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn"] Apr 16 14:30:29.294823 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:30:29.294783 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod506b4bbb_0a06_4109_b8d9_c11a5e887d40.slice/crio-b85cf9baf361062cc8a7278bfdae12094af7737011cd7be953aa7d2599b54d84 WatchSource:0}: Error finding container b85cf9baf361062cc8a7278bfdae12094af7737011cd7be953aa7d2599b54d84: Status 404 returned error can't find the container with id b85cf9baf361062cc8a7278bfdae12094af7737011cd7be953aa7d2599b54d84 Apr 16 14:30:29.590998 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:29.590960 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" event={"ID":"506b4bbb-0a06-4109-b8d9-c11a5e887d40","Type":"ContainerStarted","Data":"9e938603dcf34923239d8894a832a3961913bf6b7e62dd71793718160ff99364"} Apr 16 14:30:29.590998 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:29.591002 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" event={"ID":"506b4bbb-0a06-4109-b8d9-c11a5e887d40","Type":"ContainerStarted","Data":"b85cf9baf361062cc8a7278bfdae12094af7737011cd7be953aa7d2599b54d84"} Apr 16 14:30:33.608931 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:33.608897 2569 generic.go:358] "Generic (PLEG): container finished" podID="506b4bbb-0a06-4109-b8d9-c11a5e887d40" containerID="9e938603dcf34923239d8894a832a3961913bf6b7e62dd71793718160ff99364" exitCode=0 Apr 16 14:30:33.609344 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:33.608969 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" event={"ID":"506b4bbb-0a06-4109-b8d9-c11a5e887d40","Type":"ContainerDied","Data":"9e938603dcf34923239d8894a832a3961913bf6b7e62dd71793718160ff99364"} Apr 16 14:30:34.614435 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:34.614403 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" event={"ID":"506b4bbb-0a06-4109-b8d9-c11a5e887d40","Type":"ContainerStarted","Data":"d9084122811db2aa5d966504a1e906fb6ec855d4175cd18bee866c7aeaf50a33"} Apr 16 14:30:34.635621 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:34.635568 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" podStartSLOduration=6.635554171 podStartE2EDuration="6.635554171s" podCreationTimestamp="2026-04-16 14:30:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:34.633551254 +0000 UTC m=+1869.641787510" watchObservedRunningTime="2026-04-16 14:30:34.635554171 +0000 UTC m=+1869.643790424" Apr 16 14:30:39.149123 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:39.149081 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:39.149559 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:39.149234 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:39.161753 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:39.161730 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:39.644123 ip-10-0-128-60 
kubenswrapper[2569]: I0416 14:30:39.644097 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" Apr 16 14:30:47.331490 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.331406 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"] Apr 16 14:30:47.374467 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.374427 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"] Apr 16 14:30:47.374643 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.374560 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" Apr 16 14:30:47.377692 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.377668 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 14:30:47.377692 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.377681 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-sw49k\"" Apr 16 14:30:47.501236 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.501203 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" Apr 16 14:30:47.501444 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.501293 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9defa-c326-4a98-9f31-47ce53226a65-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" Apr 16 14:30:47.501444 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.501383 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv4pk\" (UniqueName: \"kubernetes.io/projected/26d9defa-c326-4a98-9f31-47ce53226a65-kube-api-access-wv4pk\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" Apr 16 14:30:47.501581 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.501443 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" Apr 16 14:30:47.501581 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.501473 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" Apr 16 14:30:47.501581 ip-10-0-128-60 kubenswrapper[2569]: I0416 
14:30:47.501517 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:47.602120 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.602035 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:47.602120 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.602071 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:47.602367 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.602173 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:47.602367 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.602199 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:47.602367 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.602233 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9defa-c326-4a98-9f31-47ce53226a65-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:47.602367 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.602362 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wv4pk\" (UniqueName: \"kubernetes.io/projected/26d9defa-c326-4a98-9f31-47ce53226a65-kube-api-access-wv4pk\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:47.602592 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.602522 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:47.602592 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.602540 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:47.602701 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.602612 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:47.602701 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.602659 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:47.604988 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.604961 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9defa-c326-4a98-9f31-47ce53226a65-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:47.613696 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.613667 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv4pk\" (UniqueName: \"kubernetes.io/projected/26d9defa-c326-4a98-9f31-47ce53226a65-kube-api-access-wv4pk\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:47.685153 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.685100 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:47.816050 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:47.816019 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"]
Apr 16 14:30:47.817768 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:30:47.817733 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26d9defa_c326_4a98_9f31_47ce53226a65.slice/crio-d925240e8982837c442578216edc6d89a7e9e991bf37814763171e9d16f02f74 WatchSource:0}: Error finding container d925240e8982837c442578216edc6d89a7e9e991bf37814763171e9d16f02f74: Status 404 returned error can't find the container with id d925240e8982837c442578216edc6d89a7e9e991bf37814763171e9d16f02f74
Apr 16 14:30:48.666852 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:48.666818 2569 generic.go:358] "Generic (PLEG): container finished" podID="26d9defa-c326-4a98-9f31-47ce53226a65" containerID="d93cf853dc0e6513a74e3df9d5d38a49ee52ae91f2a153cc695a82193691ee69" exitCode=0
Apr 16 14:30:48.667217 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:48.666902 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" event={"ID":"26d9defa-c326-4a98-9f31-47ce53226a65","Type":"ContainerDied","Data":"d93cf853dc0e6513a74e3df9d5d38a49ee52ae91f2a153cc695a82193691ee69"}
Apr 16 14:30:48.667217 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:48.666950 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" event={"ID":"26d9defa-c326-4a98-9f31-47ce53226a65","Type":"ContainerStarted","Data":"d925240e8982837c442578216edc6d89a7e9e991bf37814763171e9d16f02f74"}
Apr 16 14:30:49.672580 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:49.672542 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" event={"ID":"26d9defa-c326-4a98-9f31-47ce53226a65","Type":"ContainerStarted","Data":"05ba2de338da74bcd9f2502ebf9cea40ab152f7f0e8224dc97d413c63446499c"}
Apr 16 14:30:49.672580 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:49.672587 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" event={"ID":"26d9defa-c326-4a98-9f31-47ce53226a65","Type":"ContainerStarted","Data":"df1668f4fdb576192a22269968b11260e33502adbf38ad4bced17a5ebdb09862"}
Apr 16 14:30:49.673008 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:49.672674 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:49.694706 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:49.694657 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" podStartSLOduration=2.6946421860000003 podStartE2EDuration="2.694642186s" podCreationTimestamp="2026-04-16 14:30:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:49.692054073 +0000 UTC m=+1884.700290328" watchObservedRunningTime="2026-04-16 14:30:49.694642186 +0000 UTC m=+1884.702878442"
Apr 16 14:30:57.685614 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:57.685571 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:57.686190 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:57.685705 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:57.688898 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:57.688873 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:30:57.706111 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:30:57.706081 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:31:10.932387 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:10.932339 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn"]
Apr 16 14:31:10.932852 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:10.932636 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" podUID="506b4bbb-0a06-4109-b8d9-c11a5e887d40" containerName="main" containerID="cri-o://d9084122811db2aa5d966504a1e906fb6ec855d4175cd18bee866c7aeaf50a33" gracePeriod=30
Apr 16 14:31:11.187200 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.187134 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn"
Apr 16 14:31:11.216988 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.216957 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-dshm\") pod \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") "
Apr 16 14:31:11.217181 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.217040 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-kserve-provision-location\") pod \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") "
Apr 16 14:31:11.217181 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.217066 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k498z\" (UniqueName: \"kubernetes.io/projected/506b4bbb-0a06-4109-b8d9-c11a5e887d40-kube-api-access-k498z\") pod \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") "
Apr 16 14:31:11.217181 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.217081 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-model-cache\") pod \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") "
Apr 16 14:31:11.217181 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.217109 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-home\") pod \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") "
Apr 16 14:31:11.217436 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.217267 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/506b4bbb-0a06-4109-b8d9-c11a5e887d40-tls-certs\") pod \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\" (UID: \"506b4bbb-0a06-4109-b8d9-c11a5e887d40\") "
Apr 16 14:31:11.217436 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.217395 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-home" (OuterVolumeSpecName: "home") pod "506b4bbb-0a06-4109-b8d9-c11a5e887d40" (UID: "506b4bbb-0a06-4109-b8d9-c11a5e887d40"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:31:11.217544 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.217417 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-model-cache" (OuterVolumeSpecName: "model-cache") pod "506b4bbb-0a06-4109-b8d9-c11a5e887d40" (UID: "506b4bbb-0a06-4109-b8d9-c11a5e887d40"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:31:11.217667 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.217649 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-model-cache\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:31:11.217720 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.217674 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-home\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:31:11.219332 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.219225 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/506b4bbb-0a06-4109-b8d9-c11a5e887d40-kube-api-access-k498z" (OuterVolumeSpecName: "kube-api-access-k498z") pod "506b4bbb-0a06-4109-b8d9-c11a5e887d40" (UID: "506b4bbb-0a06-4109-b8d9-c11a5e887d40"). InnerVolumeSpecName "kube-api-access-k498z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:31:11.219480 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.219458 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-dshm" (OuterVolumeSpecName: "dshm") pod "506b4bbb-0a06-4109-b8d9-c11a5e887d40" (UID: "506b4bbb-0a06-4109-b8d9-c11a5e887d40"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:31:11.219743 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.219724 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506b4bbb-0a06-4109-b8d9-c11a5e887d40-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "506b4bbb-0a06-4109-b8d9-c11a5e887d40" (UID: "506b4bbb-0a06-4109-b8d9-c11a5e887d40"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:31:11.272501 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.272452 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "506b4bbb-0a06-4109-b8d9-c11a5e887d40" (UID: "506b4bbb-0a06-4109-b8d9-c11a5e887d40"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:31:11.318186 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.318152 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/506b4bbb-0a06-4109-b8d9-c11a5e887d40-tls-certs\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:31:11.318186 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.318183 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-dshm\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:31:11.318186 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.318193 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/506b4bbb-0a06-4109-b8d9-c11a5e887d40-kserve-provision-location\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:31:11.318470 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.318203 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k498z\" (UniqueName: \"kubernetes.io/projected/506b4bbb-0a06-4109-b8d9-c11a5e887d40-kube-api-access-k498z\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:31:11.758297 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.758261 2569 generic.go:358] "Generic (PLEG): container finished" podID="506b4bbb-0a06-4109-b8d9-c11a5e887d40" containerID="d9084122811db2aa5d966504a1e906fb6ec855d4175cd18bee866c7aeaf50a33" exitCode=0
Apr 16 14:31:11.758476 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.758335 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn"
Apr 16 14:31:11.758476 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.758348 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" event={"ID":"506b4bbb-0a06-4109-b8d9-c11a5e887d40","Type":"ContainerDied","Data":"d9084122811db2aa5d966504a1e906fb6ec855d4175cd18bee866c7aeaf50a33"}
Apr 16 14:31:11.758476 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.758391 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn" event={"ID":"506b4bbb-0a06-4109-b8d9-c11a5e887d40","Type":"ContainerDied","Data":"b85cf9baf361062cc8a7278bfdae12094af7737011cd7be953aa7d2599b54d84"}
Apr 16 14:31:11.758476 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.758407 2569 scope.go:117] "RemoveContainer" containerID="d9084122811db2aa5d966504a1e906fb6ec855d4175cd18bee866c7aeaf50a33"
Apr 16 14:31:11.767699 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.767678 2569 scope.go:117] "RemoveContainer" containerID="9e938603dcf34923239d8894a832a3961913bf6b7e62dd71793718160ff99364"
Apr 16 14:31:11.777204 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.777177 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn"]
Apr 16 14:31:11.787427 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.787398 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7dc6c7b87b-5jpkn"]
Apr 16 14:31:11.831885 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.831862 2569 scope.go:117] "RemoveContainer" containerID="d9084122811db2aa5d966504a1e906fb6ec855d4175cd18bee866c7aeaf50a33"
Apr 16 14:31:11.832195 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:31:11.832174 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9084122811db2aa5d966504a1e906fb6ec855d4175cd18bee866c7aeaf50a33\": container with ID starting with d9084122811db2aa5d966504a1e906fb6ec855d4175cd18bee866c7aeaf50a33 not found: ID does not exist" containerID="d9084122811db2aa5d966504a1e906fb6ec855d4175cd18bee866c7aeaf50a33"
Apr 16 14:31:11.832269 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.832208 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9084122811db2aa5d966504a1e906fb6ec855d4175cd18bee866c7aeaf50a33"} err="failed to get container status \"d9084122811db2aa5d966504a1e906fb6ec855d4175cd18bee866c7aeaf50a33\": rpc error: code = NotFound desc = could not find container \"d9084122811db2aa5d966504a1e906fb6ec855d4175cd18bee866c7aeaf50a33\": container with ID starting with d9084122811db2aa5d966504a1e906fb6ec855d4175cd18bee866c7aeaf50a33 not found: ID does not exist"
Apr 16 14:31:11.832269 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.832237 2569 scope.go:117] "RemoveContainer" containerID="9e938603dcf34923239d8894a832a3961913bf6b7e62dd71793718160ff99364"
Apr 16 14:31:11.832542 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:31:11.832522 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e938603dcf34923239d8894a832a3961913bf6b7e62dd71793718160ff99364\": container with ID starting with 9e938603dcf34923239d8894a832a3961913bf6b7e62dd71793718160ff99364 not found: ID does not exist" containerID="9e938603dcf34923239d8894a832a3961913bf6b7e62dd71793718160ff99364"
Apr 16 14:31:11.832585 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:11.832549 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e938603dcf34923239d8894a832a3961913bf6b7e62dd71793718160ff99364"} err="failed to get container status \"9e938603dcf34923239d8894a832a3961913bf6b7e62dd71793718160ff99364\": rpc error: code = NotFound desc = could not find container \"9e938603dcf34923239d8894a832a3961913bf6b7e62dd71793718160ff99364\": container with ID starting with 9e938603dcf34923239d8894a832a3961913bf6b7e62dd71793718160ff99364 not found: ID does not exist"
Apr 16 14:31:13.603807 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:13.603774 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="506b4bbb-0a06-4109-b8d9-c11a5e887d40" path="/var/lib/kubelet/pods/506b4bbb-0a06-4109-b8d9-c11a5e887d40/volumes"
Apr 16 14:31:19.715187 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:31:19.715156 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:33:18.688347 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:18.688262 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"]
Apr 16 14:33:18.688826 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:18.688616 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" podUID="26d9defa-c326-4a98-9f31-47ce53226a65" containerName="main" containerID="cri-o://df1668f4fdb576192a22269968b11260e33502adbf38ad4bced17a5ebdb09862" gracePeriod=30
Apr 16 14:33:18.688826 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:18.688658 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" podUID="26d9defa-c326-4a98-9f31-47ce53226a65" containerName="tokenizer" containerID="cri-o://05ba2de338da74bcd9f2502ebf9cea40ab152f7f0e8224dc97d413c63446499c" gracePeriod=30
Apr 16 14:33:19.262735 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:19.262698 2569 generic.go:358] "Generic (PLEG): container finished" podID="26d9defa-c326-4a98-9f31-47ce53226a65" containerID="df1668f4fdb576192a22269968b11260e33502adbf38ad4bced17a5ebdb09862" exitCode=0
Apr 16 14:33:19.262976 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:19.262758 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" event={"ID":"26d9defa-c326-4a98-9f31-47ce53226a65","Type":"ContainerDied","Data":"df1668f4fdb576192a22269968b11260e33502adbf38ad4bced17a5ebdb09862"}
Apr 16 14:33:19.713492 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:33:19.713462 2569 logging.go:55] [core] [Channel #339 SubChannel #340]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.56:9003", ServerName: "10.133.0.56:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.56:9003: connect: connection refused"
Apr 16 14:33:20.035004 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.034978 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:33:20.188350 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.188316 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-uds\") pod \"26d9defa-c326-4a98-9f31-47ce53226a65\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") "
Apr 16 14:33:20.188551 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.188372 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-tmp\") pod \"26d9defa-c326-4a98-9f31-47ce53226a65\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") "
Apr 16 14:33:20.188551 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.188400 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-kserve-provision-location\") pod \"26d9defa-c326-4a98-9f31-47ce53226a65\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") "
Apr 16 14:33:20.188551 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.188461 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv4pk\" (UniqueName: \"kubernetes.io/projected/26d9defa-c326-4a98-9f31-47ce53226a65-kube-api-access-wv4pk\") pod \"26d9defa-c326-4a98-9f31-47ce53226a65\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") "
Apr 16 14:33:20.188551 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.188494 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9defa-c326-4a98-9f31-47ce53226a65-tls-certs\") pod \"26d9defa-c326-4a98-9f31-47ce53226a65\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") "
Apr 16 14:33:20.188755 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.188552 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-cache\") pod \"26d9defa-c326-4a98-9f31-47ce53226a65\" (UID: \"26d9defa-c326-4a98-9f31-47ce53226a65\") "
Apr 16 14:33:20.188755 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.188589 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "26d9defa-c326-4a98-9f31-47ce53226a65" (UID: "26d9defa-c326-4a98-9f31-47ce53226a65"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:33:20.188854 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.188833 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-uds\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:33:20.188915 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.188829 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "26d9defa-c326-4a98-9f31-47ce53226a65" (UID: "26d9defa-c326-4a98-9f31-47ce53226a65"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:33:20.188915 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.188840 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "26d9defa-c326-4a98-9f31-47ce53226a65" (UID: "26d9defa-c326-4a98-9f31-47ce53226a65"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:33:20.189167 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.189145 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "26d9defa-c326-4a98-9f31-47ce53226a65" (UID: "26d9defa-c326-4a98-9f31-47ce53226a65"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:33:20.190689 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.190661 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d9defa-c326-4a98-9f31-47ce53226a65-kube-api-access-wv4pk" (OuterVolumeSpecName: "kube-api-access-wv4pk") pod "26d9defa-c326-4a98-9f31-47ce53226a65" (UID: "26d9defa-c326-4a98-9f31-47ce53226a65"). InnerVolumeSpecName "kube-api-access-wv4pk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:33:20.190790 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.190722 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d9defa-c326-4a98-9f31-47ce53226a65-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "26d9defa-c326-4a98-9f31-47ce53226a65" (UID: "26d9defa-c326-4a98-9f31-47ce53226a65"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:33:20.268820 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.268729 2569 generic.go:358] "Generic (PLEG): container finished" podID="26d9defa-c326-4a98-9f31-47ce53226a65" containerID="05ba2de338da74bcd9f2502ebf9cea40ab152f7f0e8224dc97d413c63446499c" exitCode=0
Apr 16 14:33:20.268820 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.268809 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"
Apr 16 14:33:20.269017 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.268810 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" event={"ID":"26d9defa-c326-4a98-9f31-47ce53226a65","Type":"ContainerDied","Data":"05ba2de338da74bcd9f2502ebf9cea40ab152f7f0e8224dc97d413c63446499c"}
Apr 16 14:33:20.269017 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.268854 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" event={"ID":"26d9defa-c326-4a98-9f31-47ce53226a65","Type":"ContainerDied","Data":"d925240e8982837c442578216edc6d89a7e9e991bf37814763171e9d16f02f74"}
Apr 16 14:33:20.269017 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.268870 2569 scope.go:117] "RemoveContainer" containerID="05ba2de338da74bcd9f2502ebf9cea40ab152f7f0e8224dc97d413c63446499c"
Apr 16 14:33:20.278982 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.278966 2569 scope.go:117] "RemoveContainer" containerID="df1668f4fdb576192a22269968b11260e33502adbf38ad4bced17a5ebdb09862"
Apr 16 14:33:20.287037 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.287019 2569 scope.go:117] "RemoveContainer" containerID="d93cf853dc0e6513a74e3df9d5d38a49ee52ae91f2a153cc695a82193691ee69"
Apr 16 14:33:20.289312 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.289292 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9defa-c326-4a98-9f31-47ce53226a65-tls-certs\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:33:20.289405 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.289316 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-cache\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:33:20.289405 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.289331 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-tokenizer-tmp\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:33:20.289405 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.289350 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26d9defa-c326-4a98-9f31-47ce53226a65-kserve-provision-location\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:33:20.289405 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.289364 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wv4pk\" (UniqueName: \"kubernetes.io/projected/26d9defa-c326-4a98-9f31-47ce53226a65-kube-api-access-wv4pk\") on node \"ip-10-0-128-60.ec2.internal\" DevicePath \"\""
Apr 16 14:33:20.294920 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.294902 2569 scope.go:117] "RemoveContainer" containerID="05ba2de338da74bcd9f2502ebf9cea40ab152f7f0e8224dc97d413c63446499c"
Apr 16 14:33:20.295181 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:33:20.295151 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05ba2de338da74bcd9f2502ebf9cea40ab152f7f0e8224dc97d413c63446499c\": container with ID starting with 05ba2de338da74bcd9f2502ebf9cea40ab152f7f0e8224dc97d413c63446499c not found: ID does not exist" containerID="05ba2de338da74bcd9f2502ebf9cea40ab152f7f0e8224dc97d413c63446499c"
Apr 16 14:33:20.295230 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.295191 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05ba2de338da74bcd9f2502ebf9cea40ab152f7f0e8224dc97d413c63446499c"} err="failed to get container status \"05ba2de338da74bcd9f2502ebf9cea40ab152f7f0e8224dc97d413c63446499c\": rpc error: code = NotFound desc = could not find container \"05ba2de338da74bcd9f2502ebf9cea40ab152f7f0e8224dc97d413c63446499c\": container with ID starting with 05ba2de338da74bcd9f2502ebf9cea40ab152f7f0e8224dc97d413c63446499c not found: ID does not exist"
Apr 16 14:33:20.295230 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.295209 2569 scope.go:117] "RemoveContainer" containerID="df1668f4fdb576192a22269968b11260e33502adbf38ad4bced17a5ebdb09862"
Apr 16 14:33:20.295457 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:33:20.295437 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df1668f4fdb576192a22269968b11260e33502adbf38ad4bced17a5ebdb09862\": container with ID starting with df1668f4fdb576192a22269968b11260e33502adbf38ad4bced17a5ebdb09862 not found: ID does not exist" containerID="df1668f4fdb576192a22269968b11260e33502adbf38ad4bced17a5ebdb09862"
Apr 16 14:33:20.295499 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.295465 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df1668f4fdb576192a22269968b11260e33502adbf38ad4bced17a5ebdb09862"} err="failed to get container status \"df1668f4fdb576192a22269968b11260e33502adbf38ad4bced17a5ebdb09862\": rpc error: code = NotFound desc = could not find container \"df1668f4fdb576192a22269968b11260e33502adbf38ad4bced17a5ebdb09862\": container with ID starting with df1668f4fdb576192a22269968b11260e33502adbf38ad4bced17a5ebdb09862 not found: ID does not exist"
Apr 16 14:33:20.295499 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.295481 2569 scope.go:117] "RemoveContainer" containerID="d93cf853dc0e6513a74e3df9d5d38a49ee52ae91f2a153cc695a82193691ee69"
Apr 16 14:33:20.295714 ip-10-0-128-60 kubenswrapper[2569]: E0416 14:33:20.295700 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d93cf853dc0e6513a74e3df9d5d38a49ee52ae91f2a153cc695a82193691ee69\": container with ID starting with d93cf853dc0e6513a74e3df9d5d38a49ee52ae91f2a153cc695a82193691ee69 not found: ID does not exist" containerID="d93cf853dc0e6513a74e3df9d5d38a49ee52ae91f2a153cc695a82193691ee69"
Apr 16 14:33:20.295764 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.295717 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d93cf853dc0e6513a74e3df9d5d38a49ee52ae91f2a153cc695a82193691ee69"} err="failed to get container status \"d93cf853dc0e6513a74e3df9d5d38a49ee52ae91f2a153cc695a82193691ee69\": rpc error: code = NotFound desc = could not find container \"d93cf853dc0e6513a74e3df9d5d38a49ee52ae91f2a153cc695a82193691ee69\": container with ID starting with d93cf853dc0e6513a74e3df9d5d38a49ee52ae91f2a153cc695a82193691ee69 not found: ID does not exist"
Apr 16 14:33:20.307339 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.307318 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"]
Apr 16 14:33:20.313705 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.313681 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt"]
Apr 16 14:33:20.713402 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:20.713365 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7c7b4f8bbfrfzt" podUID="26d9defa-c326-4a98-9f31-47ce53226a65" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.56:9003\" within 1s: context deadline exceeded"
Apr 16 14:33:21.603929 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:21.603896 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d9defa-c326-4a98-9f31-47ce53226a65" path="/var/lib/kubelet/pods/26d9defa-c326-4a98-9f31-47ce53226a65/volumes"
Apr 16 14:33:34.962571 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:34.962529 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-zkrpf_e90a9704-cfe7-474f-aaaf-2b098a4430fe/istio-proxy/0.log"
Apr 16 14:33:34.990757 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:34.990724 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-qvzp6_83d1ea6b-518f-4aa2-a83d-59086aa382c2/istio-proxy/0.log"
Apr 16 14:33:36.105986 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:36.105956 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-zkrpf_e90a9704-cfe7-474f-aaaf-2b098a4430fe/istio-proxy/0.log"
Apr 16 14:33:36.124070 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:36.124033 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-qvzp6_83d1ea6b-518f-4aa2-a83d-59086aa382c2/istio-proxy/0.log"
Apr 16 14:33:37.178232 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:37.178205 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-zkrpf_e90a9704-cfe7-474f-aaaf-2b098a4430fe/istio-proxy/0.log"
Apr 16 14:33:37.194811 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:37.194783 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-qvzp6_83d1ea6b-518f-4aa2-a83d-59086aa382c2/istio-proxy/0.log"
Apr 16 14:33:38.240576 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:38.240540 2569 log.go:25] "Finished parsing log file"
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-zkrpf_e90a9704-cfe7-474f-aaaf-2b098a4430fe/istio-proxy/0.log" Apr 16 14:33:38.256806 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:38.256778 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-qvzp6_83d1ea6b-518f-4aa2-a83d-59086aa382c2/istio-proxy/0.log" Apr 16 14:33:39.285593 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:39.285563 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-zkrpf_e90a9704-cfe7-474f-aaaf-2b098a4430fe/istio-proxy/0.log" Apr 16 14:33:39.313742 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:39.313715 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-qvzp6_83d1ea6b-518f-4aa2-a83d-59086aa382c2/istio-proxy/0.log" Apr 16 14:33:40.321715 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:40.321687 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-zkrpf_e90a9704-cfe7-474f-aaaf-2b098a4430fe/istio-proxy/0.log" Apr 16 14:33:40.349205 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:40.349174 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-qvzp6_83d1ea6b-518f-4aa2-a83d-59086aa382c2/istio-proxy/0.log" Apr 16 14:33:41.433538 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:41.433508 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-zkrpf_e90a9704-cfe7-474f-aaaf-2b098a4430fe/istio-proxy/0.log" Apr 16 14:33:41.451186 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:41.451162 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-qvzp6_83d1ea6b-518f-4aa2-a83d-59086aa382c2/istio-proxy/0.log" Apr 16 14:33:42.453088 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:42.453058 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-zkrpf_e90a9704-cfe7-474f-aaaf-2b098a4430fe/istio-proxy/0.log" Apr 16 14:33:42.471117 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:42.471092 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-qvzp6_83d1ea6b-518f-4aa2-a83d-59086aa382c2/istio-proxy/0.log" Apr 16 14:33:43.467972 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:43.467943 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-zkrpf_e90a9704-cfe7-474f-aaaf-2b098a4430fe/istio-proxy/0.log" Apr 16 14:33:43.483472 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:43.483445 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-qvzp6_83d1ea6b-518f-4aa2-a83d-59086aa382c2/istio-proxy/0.log" Apr 16 14:33:44.488542 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:44.488459 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-zkrpf_e90a9704-cfe7-474f-aaaf-2b098a4430fe/istio-proxy/0.log" Apr 16 14:33:44.504704 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:44.504676 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-qvzp6_83d1ea6b-518f-4aa2-a83d-59086aa382c2/istio-proxy/0.log" Apr 16 14:33:45.537179 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:45.537152 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-zkrpf_e90a9704-cfe7-474f-aaaf-2b098a4430fe/istio-proxy/0.log" Apr 16 14:33:45.557726 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:45.557698 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-qvzp6_83d1ea6b-518f-4aa2-a83d-59086aa382c2/istio-proxy/0.log" Apr 16 14:33:46.605268 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:46.605236 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-zkrpf_e90a9704-cfe7-474f-aaaf-2b098a4430fe/istio-proxy/0.log" Apr 16 14:33:46.624383 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:46.624355 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-qvzp6_83d1ea6b-518f-4aa2-a83d-59086aa382c2/istio-proxy/0.log" Apr 16 14:33:47.814510 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:47.814483 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-zkrpf_e90a9704-cfe7-474f-aaaf-2b098a4430fe/istio-proxy/0.log" Apr 16 14:33:47.833632 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:47.833605 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-qvzp6_83d1ea6b-518f-4aa2-a83d-59086aa382c2/istio-proxy/0.log" Apr 16 14:33:48.972062 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:48.972031 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-zkrpf_e90a9704-cfe7-474f-aaaf-2b098a4430fe/istio-proxy/0.log" Apr 16 14:33:48.996807 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:48.996778 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-qvzp6_83d1ea6b-518f-4aa2-a83d-59086aa382c2/istio-proxy/0.log" Apr 16 14:33:50.396391 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:50.396357 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5b8555f84-mbcn4_e39140fd-a9c8-42bc-8af9-0ddd8cd8addc/router/0.log" Apr 16 14:33:51.444731 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:51.444703 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5b8555f84-mbcn4_e39140fd-a9c8-42bc-8af9-0ddd8cd8addc/router/0.log" Apr 16 14:33:52.739919 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:52.739885 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-6wqhq_589a9481-338e-4fcb-9b1a-272930da4805/authorino/0.log" Apr 16 14:33:52.766880 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:52.766857 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-jvvfr_4c9bd71c-1397-46a5-a6df-59c7835d9635/manager/0.log" Apr 16 14:33:52.781491 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:52.781472 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-ss6p5_94ae3b08-04b1-494c-80f6-02db41384aca/kuadrant-console-plugin/0.log" Apr 16 14:33:58.211397 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:58.211368 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6dbn9_087e7b97-349b-4c1c-a604-82fcaaa88534/global-pull-secret-syncer/0.log" Apr 16 14:33:58.293711 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:58.293682 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9wmsd_55dc093f-e774-41c5-a0c2-2eaa10a6e366/konnectivity-agent/0.log" Apr 16 14:33:58.381635 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:33:58.381605 
2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-60.ec2.internal_0046dd3af4f2ad1568ea51124d053499/haproxy/0.log" Apr 16 14:34:02.469125 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:02.469097 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-6wqhq_589a9481-338e-4fcb-9b1a-272930da4805/authorino/0.log" Apr 16 14:34:02.527489 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:02.527459 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-jvvfr_4c9bd71c-1397-46a5-a6df-59c7835d9635/manager/0.log" Apr 16 14:34:02.553009 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:02.552979 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-ss6p5_94ae3b08-04b1-494c-80f6-02db41384aca/kuadrant-console-plugin/0.log" Apr 16 14:34:03.527574 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:03.527539 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_93d99919-c5f5-4373-a5c6-2329f656103c/alertmanager/0.log" Apr 16 14:34:03.554487 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:03.554459 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_93d99919-c5f5-4373-a5c6-2329f656103c/config-reloader/0.log" Apr 16 14:34:03.576047 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:03.576019 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_93d99919-c5f5-4373-a5c6-2329f656103c/kube-rbac-proxy-web/0.log" Apr 16 14:34:03.598860 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:03.598837 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_93d99919-c5f5-4373-a5c6-2329f656103c/kube-rbac-proxy/0.log" Apr 16 14:34:03.624574 ip-10-0-128-60 kubenswrapper[2569]: I0416 
14:34:03.624542 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_93d99919-c5f5-4373-a5c6-2329f656103c/kube-rbac-proxy-metric/0.log" Apr 16 14:34:03.646347 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:03.646318 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_93d99919-c5f5-4373-a5c6-2329f656103c/prom-label-proxy/0.log" Apr 16 14:34:03.670297 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:03.670275 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_93d99919-c5f5-4373-a5c6-2329f656103c/init-config-reloader/0.log" Apr 16 14:34:03.845750 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:03.845670 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-99jzx_1c788be0-8df6-44b5-9585-852e5bae9147/monitoring-plugin/0.log" Apr 16 14:34:04.046171 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:04.046138 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w7qjp_ea67638c-bae3-407e-bb50-aeeae5ea8f7d/node-exporter/0.log" Apr 16 14:34:04.067006 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:04.066980 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w7qjp_ea67638c-bae3-407e-bb50-aeeae5ea8f7d/kube-rbac-proxy/0.log" Apr 16 14:34:04.087163 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:04.087140 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w7qjp_ea67638c-bae3-407e-bb50-aeeae5ea8f7d/init-textfile/0.log" Apr 16 14:34:04.498453 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:04.498421 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-p5mcv_c80c5960-118a-4109-8282-3f4b1769aa2f/prometheus-operator-admission-webhook/0.log" Apr 16 
14:34:04.604444 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:04.604413 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbfb44866-n8qrk_d700b8d8-e872-461f-98da-59ab8f1ffa2c/thanos-query/0.log" Apr 16 14:34:04.627226 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:04.627198 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbfb44866-n8qrk_d700b8d8-e872-461f-98da-59ab8f1ffa2c/kube-rbac-proxy-web/0.log" Apr 16 14:34:04.650297 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:04.650274 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbfb44866-n8qrk_d700b8d8-e872-461f-98da-59ab8f1ffa2c/kube-rbac-proxy/0.log" Apr 16 14:34:04.675799 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:04.675778 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbfb44866-n8qrk_d700b8d8-e872-461f-98da-59ab8f1ffa2c/prom-label-proxy/0.log" Apr 16 14:34:04.701459 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:04.701438 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbfb44866-n8qrk_d700b8d8-e872-461f-98da-59ab8f1ffa2c/kube-rbac-proxy-rules/0.log" Apr 16 14:34:04.725369 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:04.725347 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbfb44866-n8qrk_d700b8d8-e872-461f-98da-59ab8f1ffa2c/kube-rbac-proxy-metrics/0.log" Apr 16 14:34:06.360641 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:06.360605 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/2.log" Apr 16 14:34:06.364839 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:06.364817 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-7vwhw_9f6305f8-dd82-4db8-91e9-4ddbc887813b/console-operator/3.log" Apr 16 14:34:06.759706 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:06.759680 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6955878f8c-qjfhf_a1b04d97-2b54-4049-a24d-d229a2567619/console/0.log" Apr 16 14:34:07.234068 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.234033 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q"] Apr 16 14:34:07.234521 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.234503 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26d9defa-c326-4a98-9f31-47ce53226a65" containerName="main" Apr 16 14:34:07.234521 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.234522 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d9defa-c326-4a98-9f31-47ce53226a65" containerName="main" Apr 16 14:34:07.234657 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.234543 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26d9defa-c326-4a98-9f31-47ce53226a65" containerName="tokenizer" Apr 16 14:34:07.234657 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.234549 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d9defa-c326-4a98-9f31-47ce53226a65" containerName="tokenizer" Apr 16 14:34:07.234657 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.234563 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="506b4bbb-0a06-4109-b8d9-c11a5e887d40" containerName="main" Apr 16 14:34:07.234657 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.234571 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="506b4bbb-0a06-4109-b8d9-c11a5e887d40" containerName="main" Apr 16 14:34:07.234657 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.234577 2569 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="26d9defa-c326-4a98-9f31-47ce53226a65" containerName="storage-initializer" Apr 16 14:34:07.234657 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.234582 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d9defa-c326-4a98-9f31-47ce53226a65" containerName="storage-initializer" Apr 16 14:34:07.234657 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.234588 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="506b4bbb-0a06-4109-b8d9-c11a5e887d40" containerName="storage-initializer" Apr 16 14:34:07.234657 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.234596 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="506b4bbb-0a06-4109-b8d9-c11a5e887d40" containerName="storage-initializer" Apr 16 14:34:07.234657 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.234658 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="506b4bbb-0a06-4109-b8d9-c11a5e887d40" containerName="main" Apr 16 14:34:07.234930 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.234667 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="26d9defa-c326-4a98-9f31-47ce53226a65" containerName="main" Apr 16 14:34:07.234930 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.234674 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="26d9defa-c326-4a98-9f31-47ce53226a65" containerName="tokenizer" Apr 16 14:34:07.236828 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.236812 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" Apr 16 14:34:07.239417 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.239389 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kz48r\"/\"kube-root-ca.crt\"" Apr 16 14:34:07.239527 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.239395 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kz48r\"/\"openshift-service-ca.crt\"" Apr 16 14:34:07.240465 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.240446 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kz48r\"/\"default-dockercfg-spcjg\"" Apr 16 14:34:07.247685 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.247659 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q"] Apr 16 14:34:07.299636 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.299600 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wlj5\" (UniqueName: \"kubernetes.io/projected/37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb-kube-api-access-6wlj5\") pod \"perf-node-gather-daemonset-zr56q\" (UID: \"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" Apr 16 14:34:07.299636 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.299637 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb-sys\") pod \"perf-node-gather-daemonset-zr56q\" (UID: \"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" Apr 16 14:34:07.299843 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.299672 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb-podres\") pod \"perf-node-gather-daemonset-zr56q\" (UID: \"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" Apr 16 14:34:07.299843 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.299737 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb-lib-modules\") pod \"perf-node-gather-daemonset-zr56q\" (UID: \"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" Apr 16 14:34:07.299843 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.299755 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb-proc\") pod \"perf-node-gather-daemonset-zr56q\" (UID: \"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" Apr 16 14:34:07.400998 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.400949 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wlj5\" (UniqueName: \"kubernetes.io/projected/37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb-kube-api-access-6wlj5\") pod \"perf-node-gather-daemonset-zr56q\" (UID: \"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" Apr 16 14:34:07.400998 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.400999 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb-sys\") pod \"perf-node-gather-daemonset-zr56q\" (UID: \"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb\") " 
pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" Apr 16 14:34:07.401496 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.401032 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb-podres\") pod \"perf-node-gather-daemonset-zr56q\" (UID: \"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" Apr 16 14:34:07.401496 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.401077 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb-lib-modules\") pod \"perf-node-gather-daemonset-zr56q\" (UID: \"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" Apr 16 14:34:07.401496 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.401144 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb-sys\") pod \"perf-node-gather-daemonset-zr56q\" (UID: \"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" Apr 16 14:34:07.401496 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.401145 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb-proc\") pod \"perf-node-gather-daemonset-zr56q\" (UID: \"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" Apr 16 14:34:07.401496 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.401169 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb-podres\") pod 
\"perf-node-gather-daemonset-zr56q\" (UID: \"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" Apr 16 14:34:07.401496 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.401200 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb-proc\") pod \"perf-node-gather-daemonset-zr56q\" (UID: \"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" Apr 16 14:34:07.401496 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.401239 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb-lib-modules\") pod \"perf-node-gather-daemonset-zr56q\" (UID: \"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" Apr 16 14:34:07.409443 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.409426 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wlj5\" (UniqueName: \"kubernetes.io/projected/37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb-kube-api-access-6wlj5\") pod \"perf-node-gather-daemonset-zr56q\" (UID: \"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" Apr 16 14:34:07.548171 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.548085 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q"
Apr 16 14:34:07.883142 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.883112 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q"]
Apr 16 14:34:07.884680 ip-10-0-128-60 kubenswrapper[2569]: W0416 14:34:07.884651 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod37f938ff_7a0e_4eb2_8b8e_e1917b4cb8cb.slice/crio-f4ce797c9a3a16b9a3a5595acec8e8a33ef0e527d8e08339be701ccd14d5bf2d WatchSource:0}: Error finding container f4ce797c9a3a16b9a3a5595acec8e8a33ef0e527d8e08339be701ccd14d5bf2d: Status 404 returned error can't find the container with id f4ce797c9a3a16b9a3a5595acec8e8a33ef0e527d8e08339be701ccd14d5bf2d
Apr 16 14:34:07.886212 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:07.886195 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:34:08.029071 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:08.029026 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xstwc_5fb3cee7-5cda-4d24-a176-260852fbda2c/dns/0.log"
Apr 16 14:34:08.048458 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:08.048402 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xstwc_5fb3cee7-5cda-4d24-a176-260852fbda2c/kube-rbac-proxy/0.log"
Apr 16 14:34:08.069429 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:08.069407 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4wsvh_b248dbae-841e-4eb7-a41e-cc738673d882/dns-node-resolver/0.log"
Apr 16 14:34:08.456878 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:08.456837 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" event={"ID":"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb","Type":"ContainerStarted","Data":"22771b4106bd6749bbb24f3874133e3c95b869b48a84a381401640af91d67e02"}
Apr 16 14:34:08.456878 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:08.456872 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" event={"ID":"37f938ff-7a0e-4eb2-8b8e-e1917b4cb8cb","Type":"ContainerStarted","Data":"f4ce797c9a3a16b9a3a5595acec8e8a33ef0e527d8e08339be701ccd14d5bf2d"}
Apr 16 14:34:08.457329 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:08.456961 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q"
Apr 16 14:34:08.481812 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:08.481753 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q" podStartSLOduration=1.4817329510000001 podStartE2EDuration="1.481732951s" podCreationTimestamp="2026-04-16 14:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:34:08.474013116 +0000 UTC m=+2083.482249388" watchObservedRunningTime="2026-04-16 14:34:08.481732951 +0000 UTC m=+2083.489969204"
Apr 16 14:34:08.539440 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:08.539414 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7df7868c7f-q22rr_6ba3df13-9e23-42a3-86a2-4929bfedb89a/registry/0.log"
Apr 16 14:34:08.602916 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:08.602889 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-w582s_d04a5bf9-7e36-4375-aad1-26af61c2c344/node-ca/0.log"
Apr 16 14:34:09.499924 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:09.499896 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5b8555f84-mbcn4_e39140fd-a9c8-42bc-8af9-0ddd8cd8addc/router/0.log"
Apr 16 14:34:09.950027 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:09.949997 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-q5pvc_7e8b38d3-0b12-4b9f-9c2c-d79c4a4aa393/serve-healthcheck-canary/0.log"
Apr 16 14:34:10.567707 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:10.567679 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n22cn_8b7545ee-aad8-4cc3-876d-a3a9c72d72c9/kube-rbac-proxy/0.log"
Apr 16 14:34:10.588671 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:10.588648 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n22cn_8b7545ee-aad8-4cc3-876d-a3a9c72d72c9/exporter/0.log"
Apr 16 14:34:10.609098 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:10.609068 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n22cn_8b7545ee-aad8-4cc3-876d-a3a9c72d72c9/extractor/0.log"
Apr 16 14:34:13.117034 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:13.116998 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-dc77c844c-d6p2z_afa03934-ff8c-4542-a83b-ae7567abef53/manager/0.log"
Apr 16 14:34:13.707971 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:13.707943 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-6465f8dd5f-vllbv_44db5f20-7a7a-484e-85ff-c7808be0333b/manager/0.log"
Apr 16 14:34:14.131743 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:14.131664 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-pslh8_fe2d0489-0a39-4cfb-bc54-091169ee40ff/s3-init/0.log"
Apr 16 14:34:14.161112 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:14.161084 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-82k5c_fb082a84-67c7-4c81-84cd-a32737d79ddc/seaweedfs/0.log"
Apr 16 14:34:14.470943 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:14.470918 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-zr56q"
Apr 16 14:34:20.057084 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:20.057037 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7wc6j_ce8cf9fa-1ebb-4d35-ad03-c167ea484b2f/kube-multus/0.log"
Apr 16 14:34:20.397937 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:20.397862 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljdss_88319ece-75ee-4ddb-b42a-2a26963cba92/kube-multus-additional-cni-plugins/0.log"
Apr 16 14:34:20.420075 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:20.420048 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljdss_88319ece-75ee-4ddb-b42a-2a26963cba92/egress-router-binary-copy/0.log"
Apr 16 14:34:20.441115 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:20.441087 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljdss_88319ece-75ee-4ddb-b42a-2a26963cba92/cni-plugins/0.log"
Apr 16 14:34:20.462857 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:20.462835 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljdss_88319ece-75ee-4ddb-b42a-2a26963cba92/bond-cni-plugin/0.log"
Apr 16 14:34:20.484359 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:20.484325 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljdss_88319ece-75ee-4ddb-b42a-2a26963cba92/routeoverride-cni/0.log"
Apr 16 14:34:20.507525 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:20.507497 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljdss_88319ece-75ee-4ddb-b42a-2a26963cba92/whereabouts-cni-bincopy/0.log"
Apr 16 14:34:20.531051 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:20.531015 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljdss_88319ece-75ee-4ddb-b42a-2a26963cba92/whereabouts-cni/0.log"
Apr 16 14:34:20.704103 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:20.704071 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mkz26_c449dabf-b9f5-4136-b598-074040f02629/network-metrics-daemon/0.log"
Apr 16 14:34:20.725342 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:20.725313 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mkz26_c449dabf-b9f5-4136-b598-074040f02629/kube-rbac-proxy/0.log"
Apr 16 14:34:21.495000 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:21.494961 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jdtzv_6dcff5a1-e62a-4c95-9278-292e6b914e02/ovn-controller/0.log"
Apr 16 14:34:21.519437 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:21.519406 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jdtzv_6dcff5a1-e62a-4c95-9278-292e6b914e02/ovn-acl-logging/0.log"
Apr 16 14:34:21.536623 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:21.536599 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jdtzv_6dcff5a1-e62a-4c95-9278-292e6b914e02/kube-rbac-proxy-node/0.log"
Apr 16 14:34:21.575935 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:21.575906 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jdtzv_6dcff5a1-e62a-4c95-9278-292e6b914e02/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 14:34:21.621989 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:21.621958 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jdtzv_6dcff5a1-e62a-4c95-9278-292e6b914e02/northd/0.log"
Apr 16 14:34:21.682676 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:21.682648 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jdtzv_6dcff5a1-e62a-4c95-9278-292e6b914e02/nbdb/0.log"
Apr 16 14:34:21.707183 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:21.707159 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jdtzv_6dcff5a1-e62a-4c95-9278-292e6b914e02/sbdb/0.log"
Apr 16 14:34:21.820737 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:21.820653 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jdtzv_6dcff5a1-e62a-4c95-9278-292e6b914e02/ovnkube-controller/0.log"
Apr 16 14:34:23.512041 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:23.512011 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-mhlgp_9afaf590-634d-44c9-9149-a169bbbc6320/check-endpoints/0.log"
Apr 16 14:34:23.584336 ip-10-0-128-60 kubenswrapper[2569]: I0416 14:34:23.584306 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ws5bp_e4325a5a-3a6c-429b-a7f3-5a19918e6fd0/network-check-target-container/0.log"