Apr 23 17:52:09.628168 ip-10-0-132-102 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 17:52:10.138769 ip-10-0-132-102 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:52:10.138769 ip-10-0-132-102 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 17:52:10.138769 ip-10-0-132-102 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:52:10.138769 ip-10-0-132-102 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 17:52:10.138769 ip-10-0-132-102 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
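The deprecation warnings above all point at the same remedy: move the flag values into the KubeletConfiguration file named by --config. A minimal sketch of what that migration could look like for the flags warned about here (the field names are real KubeletConfiguration fields, but the values below are illustrative placeholders, not taken from this node):

```yaml
# Sketch of /etc/kubernetes/kubelet.conf replacing the deprecated flags.
# containerRuntimeEndpoint replaces --container-runtime-endpoint,
# volumePluginDir replaces --volume-plugin-dir,
# systemReserved replaces --system-reserved,
# evictionHard/evictionSoft replace --minimum-container-ttl-duration.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"  # placeholder path
systemReserved:
  cpu: "500m"      # placeholder value
  memory: "1Gi"    # placeholder value
evictionHard:
  memory.available: "200Mi"  # placeholder value
```

On an OpenShift node this file is managed by the Machine Config Operator, so edits would normally go through a KubeletConfig custom resource rather than the file directly.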
Apr 23 17:52:10.139400 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.138821 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 17:52:10.146106 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146082 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:52:10.146106 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146099 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:52:10.146106 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146103 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:52:10.146106 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146107 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:52:10.146106 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146110 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:52:10.146106 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146113 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146117 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146120 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146123 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146125 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146128 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146131 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146133 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146136 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146139 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146141 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146144 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146147 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146149 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146152 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146155 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146157 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146160 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146163 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146167 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:52:10.146323 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146170 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146173 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146176 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146179 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146181 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146184 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146187 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146189 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146192 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146194 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146197 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146199 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146202 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146204 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146208 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146211 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146214 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146216 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146219 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:52:10.146850 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146223 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146227 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146230 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146234 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146237 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146239 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146242 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146245 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146248 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146251 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146253 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146256 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146259 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146261 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146264 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146266 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146269 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146271 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146274 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146276 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:52:10.147359 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146279 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146282 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146284 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146287 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146289 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146292 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146294 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146298 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146301 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146304 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146306 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146309 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146312 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146315 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146319 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146322 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146325 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146328 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146330 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146333 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:52:10.147865 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146336 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146339 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146752 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146758 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146761 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146764 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146769 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146773 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146776 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146779 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146782 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146785 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146788 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146791 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146794 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146796 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146799 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146801 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146804 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146807 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:52:10.148349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146809 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146812 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146814 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146817 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146820 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146822 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146825 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146828 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146830 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146833 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146835 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146838 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146840 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146843 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146845 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146848 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146850 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146853 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146856 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146858 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:52:10.148844 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146861 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146863 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146866 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146869 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146873 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146875 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146878 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146880 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146883 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146886 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146888 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146891 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146894 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146896 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146899 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146901 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146904 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146906 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146909 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146912 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:52:10.149364 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146914 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146917 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146919 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146922 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146924 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146927 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146935 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146939 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146943 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146946 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146949 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146951 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146954 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146958 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146960 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146963 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146966 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146968 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146971 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:52:10.149890 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146973 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146976 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146979 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146982 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146984 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146987 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146990 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146992 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.146995 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147616 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147630 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147637 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147641 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147646 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147650 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147654 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147659 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147662 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147665 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147669 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147672 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147675 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147678 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 23 17:52:10.150358 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147681 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147684 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147687 2576 flags.go:64] FLAG: --cloud-config=""
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147689 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147692 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147696 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147699 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147702 2576 flags.go:64] FLAG: --config-dir=""
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147705 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147709 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147713 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147716 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147719 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147723 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147726 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147729 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147731 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147734 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147753 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147758 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147761 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147764 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147767 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147772 2576 flags.go:64] FLAG: --enable-server="true"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147775 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 17:52:10.150932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147780 2576 flags.go:64] FLAG: --event-burst="100"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147783 2576 flags.go:64] FLAG: --event-qps="50"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147786 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147789 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147792 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147796 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147799 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147801 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147804 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147808 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147810 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147813 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147816 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147819 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147822 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147824 2576 flags.go:64] FLAG: --feature-gates=""
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147828 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147831 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147835 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147838 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147841 2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147844 2576 flags.go:64] FLAG: --help="false"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147847 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147850 2576 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 17:52:10.151543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147853 2576 flags.go:64]
FLAG: --http-check-frequency="20s" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147856 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147860 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147863 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147866 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147869 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147872 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147876 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147879 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147882 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147885 2576 flags.go:64] FLAG: --kube-reserved="" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147888 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147891 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147894 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 17:52:10.152125 ip-10-0-132-102 
kubenswrapper[2576]: I0423 17:52:10.147897 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147900 2576 flags.go:64] FLAG: --lock-file="" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147903 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147906 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147909 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147914 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147917 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147920 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147923 2576 flags.go:64] FLAG: --logging-format="text" Apr 23 17:52:10.152125 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147926 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147929 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147932 2576 flags.go:64] FLAG: --manifest-url="" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147935 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147939 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147942 2576 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147946 2576 flags.go:64] FLAG: --max-pods="110" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147949 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147952 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147955 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147958 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147961 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147964 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147967 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147974 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147978 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147981 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147984 2576 flags.go:64] FLAG: --pod-cidr="" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147987 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 17:52:10.152687 ip-10-0-132-102 
kubenswrapper[2576]: I0423 17:52:10.147993 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.147996 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148000 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148002 2576 flags.go:64] FLAG: --port="10250" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148005 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 17:52:10.152687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148008 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0440715597508edbb" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148011 2576 flags.go:64] FLAG: --qos-reserved="" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148014 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148017 2576 flags.go:64] FLAG: --register-node="true" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148020 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148023 2576 flags.go:64] FLAG: --register-with-taints="" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148027 2576 flags.go:64] FLAG: --registry-burst="10" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148030 2576 flags.go:64] FLAG: --registry-qps="5" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148033 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148036 2576 flags.go:64] FLAG: --reserved-memory="" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 
17:52:10.148040 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148043 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148046 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148048 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148051 2576 flags.go:64] FLAG: --runonce="false" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148055 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148058 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148061 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148063 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148066 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148069 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148072 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148077 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148080 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148083 2576 flags.go:64] FLAG: 
--storage-driver-table="stats" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148086 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 17:52:10.153293 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148089 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148093 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148095 2576 flags.go:64] FLAG: --system-cgroups="" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148098 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148104 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148109 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148112 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148116 2576 flags.go:64] FLAG: --tls-min-version="" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148119 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148122 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148125 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148128 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148131 2576 flags.go:64] FLAG: --v="2" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: I0423 
17:52:10.148135 2576 flags.go:64] FLAG: --version="false" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148139 2576 flags.go:64] FLAG: --vmodule="" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148144 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.148147 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148238 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148241 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148244 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148247 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148250 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148254 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 17:52:10.153995 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148258 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148260 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148263 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148265 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148268 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148275 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148278 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148280 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148283 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148285 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148288 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148291 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148294 2576 feature_gate.go:328] unrecognized feature gate: 
MetricsCollectionProfiles Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148296 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148300 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148303 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148306 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148308 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148311 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148313 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:52:10.154623 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148316 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148318 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148321 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148323 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148326 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148329 
2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148331 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148333 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148336 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148339 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148341 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148343 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148346 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148349 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148351 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148354 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148356 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148360 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:52:10.155196 
ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148363 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148365 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:52:10.155196 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148368 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148370 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148373 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148376 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148379 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148383 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148387 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148390 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148393 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148395 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148398 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148401 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148403 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148406 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148408 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148411 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148413 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148415 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148418 2576 feature_gate.go:328] 
unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148420 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:52:10.155753 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148423 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148426 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148428 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148430 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148433 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148435 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148438 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148440 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148443 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148446 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148449 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:52:10.156279 
ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148451 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148454 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148456 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148458 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148461 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148464 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148466 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148470 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:52:10.156279 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.148472 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:52:10.156755 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.150373 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 
23 17:52:10.157857 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.157732 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 17:52:10.157896 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.157858 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 17:52:10.157929 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157910 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:52:10.157929 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157915 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:52:10.157929 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157919 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:52:10.157929 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157922 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:52:10.157929 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157925 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:52:10.157929 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157930 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157934 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157938 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157940 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157944 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157947 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157950 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157952 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157955 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157958 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157960 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157963 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157965 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157969 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157972 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157974 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157977 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157980 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157983 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157985 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:52:10.158082 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157988 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157990 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157993 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157995 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.157998 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158000 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158004 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158007 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158010 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158012 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158015 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158018 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158020 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158023 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158026 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158029 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158031 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158034 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158036 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158039 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:52:10.158564 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158041 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158044 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158046 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158049 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158052 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158054 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158057 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158060 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158063 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158065 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158068 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158071 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158073 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158076 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158078 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158081 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158083 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158086 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158089 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:52:10.159086 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158092 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158095 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158097 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158100 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158102 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158105 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158108 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158111 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158113 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158116 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158119 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158122 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158124 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158128 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158132 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158135 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158138 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158141 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158144 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158147 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:52:10.159577 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158150 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:52:10.160077 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158152 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:52:10.160077 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.158158 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:52:10.160077 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158256 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:52:10.160077 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158261 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:52:10.160077 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158264 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:52:10.160077 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158267 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:52:10.160077 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158270 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:52:10.160077 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158272 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:52:10.160077 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158275 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:52:10.160077 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158277 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:52:10.160077 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158280 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:52:10.160077 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158282 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:52:10.160077 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158285 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:52:10.160077 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158288 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:52:10.160077 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158290 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158293 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158296 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158298 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158302 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158304 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158307 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158310 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158313 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158315 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158318 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158320 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158323 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158326 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158328 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158331 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158333 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158336 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158339 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158341 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:52:10.160437 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158344 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158346 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158348 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158351 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158354 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158357 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158360 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158363 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158366 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158369 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158371 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158375 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158377 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158380 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158382 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158385 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158388 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158391 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158394 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158396 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:52:10.160932 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158399 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158401 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158404 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158407 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158409 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158412 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158414 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158417 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158419 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158422 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158424 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158426 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158429 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158431 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158434 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158436 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158439 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158441 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158444 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158447 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:52:10.161426 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158449 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:52:10.161937 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158452 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:52:10.161937 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158455 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
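The long runs of feature_gate.go:328 warnings are easier to audit in aggregate than entry by entry. A minimal sketch (assumptions: GNU/POSIX sed and coreutils; on the live node the input would come from the journal, e.g. via `journalctl -t kubenswrapper`, matching the syslog identifier seen in these entries) that counts each distinct unrecognized gate; sample lines are embedded so the pipeline runs as-is:

```shell
# Count distinct "unrecognized feature gate" warnings, most frequent first.
# The embedded sample stands in for real journal output; against the node,
# feed this from e.g. `journalctl -t kubenswrapper` instead of printf.
printf '%s\n' \
  'W0423 17:52:10.157910 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation' \
  'W0423 17:52:10.158336 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation' \
  'W0423 17:52:10.157915 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity' |
  sed -n 's/.*unrecognized feature gate: \([A-Za-z0-9]*\).*/\1/p' |  # keep only the gate name
  sort | uniq -c | sort -rn                                          # count and rank
```

This prints each gate name with its occurrence count (whitespace-padded by `uniq -c`), highest count first, which makes it obvious that the same gates are reported once per parsing pass rather than signaling distinct problems.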
Apr 23 17:52:10.161937 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158460 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:52:10.161937 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158463 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:52:10.161937 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158466 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:52:10.161937 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158468 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:52:10.161937 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158471 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:52:10.161937 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158474 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:52:10.161937 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158477 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:52:10.161937 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158480 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:52:10.161937 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158482 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:52:10.161937 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158484 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:52:10.161937 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:10.158487 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:52:10.161937 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.158492 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:52:10.161937 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.158599 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 17:52:10.162308 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.161012 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 17:52:10.162950 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.162938 2576 server.go:1019] "Starting client certificate rotation"
Apr 23 17:52:10.163055 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.163037 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 17:52:10.163104 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.163085 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 17:52:10.191138 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.191115 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 17:52:10.195207 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.195184 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 17:52:10.211789 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.211768 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 23 17:52:10.217485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.217466 2576 log.go:25] "Validated CRI v1 image API"
Apr 23 17:52:10.218887 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.218872 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 17:52:10.223251 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.223232 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 17:52:10.224588 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.224570 2576 fs.go:135] Filesystem UUIDs: map[179363e9-9e41-4265-a8b9-e23472b002ad:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 dfa52e28-27c5-4a09-8aee-1fcee1b8ab16:/dev/nvme0n1p4]
Apr 23 17:52:10.224638 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.224590 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 17:52:10.230155 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.230054 2576 manager.go:217] Machine: {Timestamp:2026-04-23 17:52:10.228277515 +0000 UTC m=+0.463724209 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3090670 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f3dd9022b63c7af67b3afa429222b SystemUUID:ec2f3dd9-022b-63c7-af67-b3afa429222b BootID:5f6ce687-0527-4b5a-aebb-1964c49d8edb Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:94:7f:cf:83:11 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:94:7f:cf:83:11 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:f5:95:e4:41:e5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 17:52:10.230155 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.230150 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
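Each feature-gate parsing pass above ends with a single feature_gate.go:384 `feature gates: {map[...]}` summary of the gates the kubelet actually resolved. A small sketch (POSIX sed/tr assumed; the embedded sample uses a shortened map, with gate names taken from the summaries above) that explodes that one-line map into one `Gate:value` pair per line, which is convenient for diffing the effective gates across kubelet restarts:

```shell
# Explode the kubelet's resolved feature-gate map into one line per gate.
line='I0423 17:52:10.158158 2576 feature_gate.go:384] feature gates: {map[ImageVolume:true KMSv1:true NodeSwap:false]}'
echo "$line" |
  sed -n 's/.*{map\[\(.*\)\]}.*/\1/p' |  # keep only the contents of {map[...]}
  tr ' ' '\n'                            # one Gate:value pair per line
# prints:
#   ImageVolume:true
#   KMSv1:true
#   NodeSwap:false
```

Sorted output from two boots can then be compared with plain `diff` to spot a gate that changed state, without reading the full one-line map by eye.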
Apr 23 17:52:10.230301 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.230279 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 17:52:10.230597 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.230574 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 17:52:10.230733 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.230598 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-102.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 17:52:10.230797 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.230758 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 17:52:10.230797 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.230767 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 17:52:10.230797 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.230780 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 17:52:10.231665 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.231655 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 17:52:10.232453 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.232442 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 17:52:10.232563 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.232553 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 17:52:10.234852 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.234843 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 17:52:10.234891 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.234861 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 17:52:10.234891 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.234873 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 17:52:10.234891 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.234882 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 23 17:52:10.234891 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.234891 2576 apiserver.go:42] "Waiting for node sync
before watching apiserver pods" Apr 23 17:52:10.236028 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.236017 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 17:52:10.236071 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.236043 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 17:52:10.239505 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.239485 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 17:52:10.240861 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.240848 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 17:52:10.242794 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.242778 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 17:52:10.242857 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.242798 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 17:52:10.242857 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.242805 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 17:52:10.242857 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.242810 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 17:52:10.242857 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.242816 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 17:52:10.242857 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.242822 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 17:52:10.242857 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.242828 2576 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 23 17:52:10.242857 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.242833 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 17:52:10.242857 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.242841 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 17:52:10.242857 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.242847 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 17:52:10.242857 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.242856 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 17:52:10.243112 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.242865 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 17:52:10.244401 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.244391 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 17:52:10.244401 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.244401 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 17:52:10.248162 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.248147 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 17:52:10.248240 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.248189 2576 server.go:1295] "Started kubelet" Apr 23 17:52:10.248395 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.248348 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-102.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 17:52:10.248433 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.248333 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 
burstTokens=10 Apr 23 17:52:10.248461 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.248447 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 17:52:10.248461 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.248443 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-102.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:52:10.248516 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.248488 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 17:52:10.248545 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.248529 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 17:52:10.249061 ip-10-0-132-102 systemd[1]: Started Kubernetes Kubelet. 
Apr 23 17:52:10.249712 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.249697 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 17:52:10.253958 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.253938 2576 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 17:52:10.258416 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.256273 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-102.ec2.internal.18a90dd38e210ead default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-102.ec2.internal,UID:ip-10-0-132-102.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-102.ec2.internal,},FirstTimestamp:2026-04-23 17:52:10.248162989 +0000 UTC m=+0.483609686,LastTimestamp:2026-04-23 17:52:10.248162989 +0000 UTC m=+0.483609686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-102.ec2.internal,}"
Apr 23 17:52:10.258416 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.258414 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 17:52:10.258947 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.258885 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 17:52:10.259887 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.259848 2576 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 17:52:10.259887 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.259871 2576 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 17:52:10.260017 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.260012 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 17:52:10.260061 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.260054 2576 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 17:52:10.260100 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.260061 2576 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 17:52:10.260173 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.260156 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 17:52:10.260207 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.260180 2576 factory.go:55] Registering systemd factory
Apr 23 17:52:10.260207 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.260195 2576 factory.go:223] Registration of the systemd container factory successfully
Apr 23 17:52:10.260277 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.260176 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-102.ec2.internal\" not found"
Apr 23 17:52:10.260484 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.260432 2576 factory.go:153] Registering CRI-O factory
Apr 23 17:52:10.260484 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.260476 2576 factory.go:223] Registration of the crio container factory successfully
Apr 23 17:52:10.260576 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.260501 2576 factory.go:103] Registering Raw factory
Apr 23 17:52:10.260576 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.260561 2576 manager.go:1196] Started watching for new ooms in manager
Apr 23 17:52:10.261213 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.261185 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 17:52:10.261294 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.261277 2576 manager.go:319] Starting recovery of all containers
Apr 23 17:52:10.267670 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.267627 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 17:52:10.267884 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.267854 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 17:52:10.267992 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.267961 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-102.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 17:52:10.271378 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.271362 2576 manager.go:324] Recovery completed
Apr 23 17:52:10.275659 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.275645 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:10.276836 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.276817 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mq786"
Apr 23 17:52:10.280485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.280469 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:10.280559 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.280497 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:10.280559 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.280508 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:10.281022 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.281005 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 17:52:10.281022 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.281019 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 17:52:10.281134 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.281033 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 17:52:10.282024 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.281955 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-102.ec2.internal.18a90dd3900e3e37 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-102.ec2.internal,UID:ip-10-0-132-102.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-132-102.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-132-102.ec2.internal,},FirstTimestamp:2026-04-23 17:52:10.280484407 +0000 UTC m=+0.515931101,LastTimestamp:2026-04-23 17:52:10.280484407 +0000 UTC m=+0.515931101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-102.ec2.internal,}"
Apr 23 17:52:10.283122 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.283111 2576 policy_none.go:49] "None policy: Start"
Apr 23 17:52:10.283183 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.283126 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 17:52:10.283183 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.283137 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 17:52:10.284208 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.284191 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mq786"
Apr 23 17:52:10.327524 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.320929 2576 manager.go:341] "Starting Device Plugin manager"
Apr 23 17:52:10.327524 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.320959 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 17:52:10.327524 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.320973 2576 server.go:85] "Starting device plugin registration server"
Apr 23 17:52:10.327524 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.321249 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 17:52:10.327524 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.321265 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 17:52:10.327524 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.321411 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 17:52:10.327524 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.321493 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 17:52:10.327524 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.321502 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 17:52:10.327524 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.322328 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 17:52:10.327524 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.322361 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-102.ec2.internal\" not found"
Apr 23 17:52:10.388998 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.388928 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 17:52:10.388998 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.388962 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 17:52:10.388998 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.388980 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 17:52:10.388998 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.388988 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 17:52:10.389211 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.389069 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 17:52:10.392549 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.392527 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:52:10.421804 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.421785 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:10.422656 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.422640 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:10.422737 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.422667 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:10.422737 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.422678 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:10.422737 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.422700 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.432183 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.432161 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.432240 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.432187 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-102.ec2.internal\": node \"ip-10-0-132-102.ec2.internal\" not found"
Apr 23 17:52:10.449629 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.449606 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-102.ec2.internal\" not found"
Apr 23 17:52:10.489710 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.489678 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-102.ec2.internal"]
Apr 23 17:52:10.489808 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.489778 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:10.490702 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.490684 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:10.490794 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.490715 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:10.490794 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.490727 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:10.491868 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.491856 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:10.492022 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.492006 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.492057 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.492038 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:10.492802 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.492788 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:10.492870 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.492802 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:10.492870 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.492815 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:10.492870 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.492824 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:10.492870 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.492831 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:10.492870 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.492839 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:10.493898 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.493883 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.493948 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.493915 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:10.494578 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.494562 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:10.494648 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.494590 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:10.494648 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.494602 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:10.510304 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.510283 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-102.ec2.internal\" not found" node="ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.514575 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.514562 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-102.ec2.internal\" not found" node="ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.549811 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.549783 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-102.ec2.internal\" not found"
Apr 23 17:52:10.650654 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.650569 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-102.ec2.internal\" not found"
Apr 23 17:52:10.661967 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.661938 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13a2aab92beaa8cd38c68b02321633e1-config\") pod \"kube-apiserver-proxy-ip-10-0-132-102.ec2.internal\" (UID: \"13a2aab92beaa8cd38c68b02321633e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.662079 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.661975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1a97617bff83f0fe8f7c02c491ea0ab2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal\" (UID: \"1a97617bff83f0fe8f7c02c491ea0ab2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.662079 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.662004 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a97617bff83f0fe8f7c02c491ea0ab2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal\" (UID: \"1a97617bff83f0fe8f7c02c491ea0ab2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.751324 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.751270 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-102.ec2.internal\" not found"
Apr 23 17:52:10.762699 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.762676 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1a97617bff83f0fe8f7c02c491ea0ab2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal\" (UID: \"1a97617bff83f0fe8f7c02c491ea0ab2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.762797 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.762712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a97617bff83f0fe8f7c02c491ea0ab2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal\" (UID: \"1a97617bff83f0fe8f7c02c491ea0ab2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.762797 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.762734 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13a2aab92beaa8cd38c68b02321633e1-config\") pod \"kube-apiserver-proxy-ip-10-0-132-102.ec2.internal\" (UID: \"13a2aab92beaa8cd38c68b02321633e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.762879 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.762808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1a97617bff83f0fe8f7c02c491ea0ab2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal\" (UID: \"1a97617bff83f0fe8f7c02c491ea0ab2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.762920 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.762875 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a97617bff83f0fe8f7c02c491ea0ab2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal\" (UID: \"1a97617bff83f0fe8f7c02c491ea0ab2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.762920 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.762912 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13a2aab92beaa8cd38c68b02321633e1-config\") pod \"kube-apiserver-proxy-ip-10-0-132-102.ec2.internal\" (UID: \"13a2aab92beaa8cd38c68b02321633e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.812842 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.812807 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.817516 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:10.817498 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-102.ec2.internal"
Apr 23 17:52:10.852369 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.852330 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-102.ec2.internal\" not found"
Apr 23 17:52:10.952962 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:10.952852 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-102.ec2.internal\" not found"
Apr 23 17:52:11.053429 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:11.053396 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-102.ec2.internal\" not found"
Apr 23 17:52:11.154030 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:11.153990 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-102.ec2.internal\" not found"
Apr 23 17:52:11.162316 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.162293 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 17:52:11.162446 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.162430 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:52:11.213351 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.213270 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:52:11.235832 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.235804 2576 apiserver.go:52] "Watching apiserver"
Apr 23 17:52:11.242143 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.242112 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 17:52:11.243198 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.243176 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-bkrt6","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt","openshift-cluster-node-tuning-operator/tuned-z6prg","openshift-image-registry/node-ca-9pnhp"]
Apr 23 17:52:11.245597 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.245575 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bkrt6"
Apr 23 17:52:11.246672 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.246649 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt"
Apr 23 17:52:11.246774 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.246692 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z6prg"
Apr 23 17:52:11.247439 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.247422 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 17:52:11.247530 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.247487 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 17:52:11.247691 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.247666 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9pnhp"
Apr 23 17:52:11.247907 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.247887 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-vjs8g\""
Apr 23 17:52:11.248442 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.248423 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gzl8x\""
Apr 23 17:52:11.248700 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.248684 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 17:52:11.248788 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.248684 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-bq2xd\""
Apr 23 17:52:11.249148 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.249131 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 17:52:11.249189 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.249147 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 17:52:11.249189 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.249162 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 17:52:11.249256 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.249234 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:52:11.249567 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.249546 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 17:52:11.249658 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.249571 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 17:52:11.249658 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.249604 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 17:52:11.249658 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.249634 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4nmd5\""
Apr 23 17:52:11.258951 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.258926 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 17:52:11.260428 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.260410 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal"
Apr 23 17:52:11.260677 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.260663 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state 
of world" Apr 23 17:52:11.265751 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.265725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/712ef82b-3fe4-488d-9956-2e0264016fa7-konnectivity-ca\") pod \"konnectivity-agent-bkrt6\" (UID: \"712ef82b-3fe4-488d-9956-2e0264016fa7\") " pod="kube-system/konnectivity-agent-bkrt6" Apr 23 17:52:11.265823 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.265768 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" Apr 23 17:52:11.265823 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.265787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-sysconfig\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.265823 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.265803 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-host\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.265939 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.265831 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j72hf\" (UniqueName: 
\"kubernetes.io/projected/82216d67-3ae3-4fd5-be5c-85a939836d44-kube-api-access-j72hf\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.265939 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.265867 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k9t6\" (UniqueName: \"kubernetes.io/projected/4c608978-9ca3-4730-81a8-ed012e4601c4-kube-api-access-5k9t6\") pod \"node-ca-9pnhp\" (UID: \"4c608978-9ca3-4730-81a8-ed012e4601c4\") " pod="openshift-image-registry/node-ca-9pnhp" Apr 23 17:52:11.265939 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.265883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv4nk\" (UniqueName: \"kubernetes.io/projected/40b803bf-2eb9-4d1b-aaf1-45331a946c46-kube-api-access-dv4nk\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" Apr 23 17:52:11.265939 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.265924 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-kubernetes\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.266095 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.265956 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-etc-selinux\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" Apr 23 17:52:11.266095 
ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.265972 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-var-lib-kubelet\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.266095 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.265992 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-tuned\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.266095 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.266013 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-socket-dir\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" Apr 23 17:52:11.266095 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.266029 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-systemd\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.266095 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.266044 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82216d67-3ae3-4fd5-be5c-85a939836d44-tmp\") pod \"tuned-z6prg\" (UID: 
\"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.266095 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.266063 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-modprobe-d\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.266095 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.266088 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-sys\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.266435 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.266106 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-lib-modules\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.266435 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.266122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c608978-9ca3-4730-81a8-ed012e4601c4-host\") pod \"node-ca-9pnhp\" (UID: \"4c608978-9ca3-4730-81a8-ed012e4601c4\") " pod="openshift-image-registry/node-ca-9pnhp" Apr 23 17:52:11.266435 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.266143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/4c608978-9ca3-4730-81a8-ed012e4601c4-serviceca\") pod \"node-ca-9pnhp\" (UID: \"4c608978-9ca3-4730-81a8-ed012e4601c4\") " pod="openshift-image-registry/node-ca-9pnhp" Apr 23 17:52:11.266435 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.266159 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-sys-fs\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" Apr 23 17:52:11.266435 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.266181 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-sysctl-conf\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.266435 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.266221 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-run\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.266435 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.266262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/712ef82b-3fe4-488d-9956-2e0264016fa7-agent-certs\") pod \"konnectivity-agent-bkrt6\" (UID: \"712ef82b-3fe4-488d-9956-2e0264016fa7\") " pod="kube-system/konnectivity-agent-bkrt6" Apr 23 17:52:11.266435 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.266287 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-registration-dir\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" Apr 23 17:52:11.266435 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.266312 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-device-dir\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" Apr 23 17:52:11.266435 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.266334 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-sysctl-d\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.267292 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.267274 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 17:52:11.274847 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.274827 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 17:52:11.274935 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.274899 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-102.ec2.internal" Apr 23 17:52:11.275000 ip-10-0-132-102 kubenswrapper[2576]: I0423 
17:52:11.274986 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal"] Apr 23 17:52:11.283090 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.282952 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-132-102.ec2.internal"] Apr 23 17:52:11.283320 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.283303 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 17:52:11.286459 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.286432 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 17:47:10 +0000 UTC" deadline="2027-12-12 18:15:09.842679992 +0000 UTC" Apr 23 17:52:11.286459 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.286456 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14352h22m58.556225791s" Apr 23 17:52:11.292569 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.292551 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-nzvc2" Apr 23 17:52:11.297298 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:11.297275 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13a2aab92beaa8cd38c68b02321633e1.slice/crio-7dc06e6e9c9714796b7a6068359de7b1b2b5f61bdccb56e5fb0b64a4d71807ca WatchSource:0}: Error finding container 7dc06e6e9c9714796b7a6068359de7b1b2b5f61bdccb56e5fb0b64a4d71807ca: Status 404 returned error can't find the container with id 7dc06e6e9c9714796b7a6068359de7b1b2b5f61bdccb56e5fb0b64a4d71807ca Apr 23 17:52:11.297517 ip-10-0-132-102 kubenswrapper[2576]: 
W0423 17:52:11.297497 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a97617bff83f0fe8f7c02c491ea0ab2.slice/crio-59957673a14c3c24300c72f304a1003031499a61a4e6b8e0b5de7117e810da05 WatchSource:0}: Error finding container 59957673a14c3c24300c72f304a1003031499a61a4e6b8e0b5de7117e810da05: Status 404 returned error can't find the container with id 59957673a14c3c24300c72f304a1003031499a61a4e6b8e0b5de7117e810da05 Apr 23 17:52:11.300871 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.300849 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-nzvc2" Apr 23 17:52:11.302955 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.302935 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 17:52:11.367129 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367095 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82216d67-3ae3-4fd5-be5c-85a939836d44-tmp\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.367129 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367126 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-modprobe-d\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-sys\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " 
pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-lib-modules\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c608978-9ca3-4730-81a8-ed012e4601c4-host\") pod \"node-ca-9pnhp\" (UID: \"4c608978-9ca3-4730-81a8-ed012e4601c4\") " pod="openshift-image-registry/node-ca-9pnhp" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4c608978-9ca3-4730-81a8-ed012e4601c4-serviceca\") pod \"node-ca-9pnhp\" (UID: \"4c608978-9ca3-4730-81a8-ed012e4601c4\") " pod="openshift-image-registry/node-ca-9pnhp" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-sys-fs\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-sysctl-conf\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " 
pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367249 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-run\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c608978-9ca3-4730-81a8-ed012e4601c4-host\") pod \"node-ca-9pnhp\" (UID: \"4c608978-9ca3-4730-81a8-ed012e4601c4\") " pod="openshift-image-registry/node-ca-9pnhp" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367271 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/712ef82b-3fe4-488d-9956-2e0264016fa7-agent-certs\") pod \"konnectivity-agent-bkrt6\" (UID: \"712ef82b-3fe4-488d-9956-2e0264016fa7\") " pod="kube-system/konnectivity-agent-bkrt6" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367251 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-sys\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-registration-dir\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-modprobe-d\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367299 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-sys-fs\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-run\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367346 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-lib-modules\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367346 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-registration-dir\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" Apr 23 17:52:11.367361 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367353 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-sysctl-conf\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-device-dir\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367389 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-device-dir\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-sysctl-d\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/712ef82b-3fe4-488d-9956-2e0264016fa7-konnectivity-ca\") pod 
\"konnectivity-agent-bkrt6\" (UID: \"712ef82b-3fe4-488d-9956-2e0264016fa7\") " pod="kube-system/konnectivity-agent-bkrt6" Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-sysconfig\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-host\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-sysctl-d\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg" Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367507 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt"
Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367521 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j72hf\" (UniqueName: \"kubernetes.io/projected/82216d67-3ae3-4fd5-be5c-85a939836d44-kube-api-access-j72hf\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg"
Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5k9t6\" (UniqueName: \"kubernetes.io/projected/4c608978-9ca3-4730-81a8-ed012e4601c4-kube-api-access-5k9t6\") pod \"node-ca-9pnhp\" (UID: \"4c608978-9ca3-4730-81a8-ed012e4601c4\") " pod="openshift-image-registry/node-ca-9pnhp"
Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367566 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-host\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg"
Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367560 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-sysconfig\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg"
Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dv4nk\" (UniqueName: \"kubernetes.io/projected/40b803bf-2eb9-4d1b-aaf1-45331a946c46-kube-api-access-dv4nk\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt"
Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367622 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-kubernetes\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg"
Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367634 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 17:52:11.368133 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-etc-selinux\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt"
Apr 23 17:52:11.368631 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-var-lib-kubelet\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg"
Apr 23 17:52:11.368631 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-tuned\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg"
Apr 23 17:52:11.368631 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4c608978-9ca3-4730-81a8-ed012e4601c4-serviceca\") pod \"node-ca-9pnhp\" (UID: \"4c608978-9ca3-4730-81a8-ed012e4601c4\") " pod="openshift-image-registry/node-ca-9pnhp"
Apr 23 17:52:11.368631 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367719 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-socket-dir\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt"
Apr 23 17:52:11.368631 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-systemd\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg"
Apr 23 17:52:11.368631 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367775 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-var-lib-kubelet\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg"
Apr 23 17:52:11.368631 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-kubernetes\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg"
Apr 23 17:52:11.368631 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-systemd\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg"
Apr 23 17:52:11.368631 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367895 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-socket-dir\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt"
Apr 23 17:52:11.368631 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.367954 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/40b803bf-2eb9-4d1b-aaf1-45331a946c46-etc-selinux\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt"
Apr 23 17:52:11.368631 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.368031 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/712ef82b-3fe4-488d-9956-2e0264016fa7-konnectivity-ca\") pod \"konnectivity-agent-bkrt6\" (UID: \"712ef82b-3fe4-488d-9956-2e0264016fa7\") " pod="kube-system/konnectivity-agent-bkrt6"
Apr 23 17:52:11.370607 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.370576 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/82216d67-3ae3-4fd5-be5c-85a939836d44-etc-tuned\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg"
Apr 23 17:52:11.370704 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.370621 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82216d67-3ae3-4fd5-be5c-85a939836d44-tmp\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg"
Apr 23 17:52:11.370869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.370854 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/712ef82b-3fe4-488d-9956-2e0264016fa7-agent-certs\") pod \"konnectivity-agent-bkrt6\" (UID: \"712ef82b-3fe4-488d-9956-2e0264016fa7\") " pod="kube-system/konnectivity-agent-bkrt6"
Apr 23 17:52:11.376261 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.376230 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k9t6\" (UniqueName: \"kubernetes.io/projected/4c608978-9ca3-4730-81a8-ed012e4601c4-kube-api-access-5k9t6\") pod \"node-ca-9pnhp\" (UID: \"4c608978-9ca3-4730-81a8-ed012e4601c4\") " pod="openshift-image-registry/node-ca-9pnhp"
Apr 23 17:52:11.376356 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.376345 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv4nk\" (UniqueName: \"kubernetes.io/projected/40b803bf-2eb9-4d1b-aaf1-45331a946c46-kube-api-access-dv4nk\") pod \"aws-ebs-csi-driver-node-f5jrt\" (UID: \"40b803bf-2eb9-4d1b-aaf1-45331a946c46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt"
Apr 23 17:52:11.376398 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.376386 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j72hf\" (UniqueName: \"kubernetes.io/projected/82216d67-3ae3-4fd5-be5c-85a939836d44-kube-api-access-j72hf\") pod \"tuned-z6prg\" (UID: \"82216d67-3ae3-4fd5-be5c-85a939836d44\") " pod="openshift-cluster-node-tuning-operator/tuned-z6prg"
Apr 23 17:52:11.392021 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.391972 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-102.ec2.internal" event={"ID":"13a2aab92beaa8cd38c68b02321633e1","Type":"ContainerStarted","Data":"7dc06e6e9c9714796b7a6068359de7b1b2b5f61bdccb56e5fb0b64a4d71807ca"}
Apr 23 17:52:11.392856 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.392837 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal" event={"ID":"1a97617bff83f0fe8f7c02c491ea0ab2","Type":"ContainerStarted","Data":"59957673a14c3c24300c72f304a1003031499a61a4e6b8e0b5de7117e810da05"}
Apr 23 17:52:11.574469 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.574371 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bkrt6"
Apr 23 17:52:11.580105 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.580081 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt"
Apr 23 17:52:11.581677 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:11.581648 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod712ef82b_3fe4_488d_9956_2e0264016fa7.slice/crio-fa43c5f8be6f02fb95d15ee947fcaacbe60212a4fa1208a6b1dd19684068b6cc WatchSource:0}: Error finding container fa43c5f8be6f02fb95d15ee947fcaacbe60212a4fa1208a6b1dd19684068b6cc: Status 404 returned error can't find the container with id fa43c5f8be6f02fb95d15ee947fcaacbe60212a4fa1208a6b1dd19684068b6cc
Apr 23 17:52:11.586671 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:11.586647 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40b803bf_2eb9_4d1b_aaf1_45331a946c46.slice/crio-7141758a54788d202a7e7139774f59271acf8146011f405006d007c812fec885 WatchSource:0}: Error finding container 7141758a54788d202a7e7139774f59271acf8146011f405006d007c812fec885: Status 404 returned error can't find the container with id 7141758a54788d202a7e7139774f59271acf8146011f405006d007c812fec885
Apr 23 17:52:11.606340 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.606307 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z6prg"
Apr 23 17:52:11.609908 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.609871 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9pnhp"
Apr 23 17:52:11.614471 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:11.614444 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82216d67_3ae3_4fd5_be5c_85a939836d44.slice/crio-e8231a3555cbcf032aa68447c91327eabe2ae06f34578516129561fc5e9b5d0c WatchSource:0}: Error finding container e8231a3555cbcf032aa68447c91327eabe2ae06f34578516129561fc5e9b5d0c: Status 404 returned error can't find the container with id e8231a3555cbcf032aa68447c91327eabe2ae06f34578516129561fc5e9b5d0c
Apr 23 17:52:11.617554 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:52:11.617528 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c608978_9ca3_4730_81a8_ed012e4601c4.slice/crio-a1f02280a10284e8b442cf201f9a566b0d77d375a2da4eda2aed9cacc365c465 WatchSource:0}: Error finding container a1f02280a10284e8b442cf201f9a566b0d77d375a2da4eda2aed9cacc365c465: Status 404 returned error can't find the container with id a1f02280a10284e8b442cf201f9a566b0d77d375a2da4eda2aed9cacc365c465
Apr 23 17:52:11.831898 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.831805 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:52:11.832090 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.832047 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:52:11.986849 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:11.986807 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:52:12.301729 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:12.301640 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:47:11 +0000 UTC" deadline="2028-01-12 23:01:26.55825921 +0000 UTC"
Apr 23 17:52:12.301729 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:12.301679 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15101h9m14.256585231s"
Apr 23 17:52:12.396423 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:12.396369 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9pnhp" event={"ID":"4c608978-9ca3-4730-81a8-ed012e4601c4","Type":"ContainerStarted","Data":"a1f02280a10284e8b442cf201f9a566b0d77d375a2da4eda2aed9cacc365c465"}
Apr 23 17:52:12.400895 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:12.400861 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z6prg" event={"ID":"82216d67-3ae3-4fd5-be5c-85a939836d44","Type":"ContainerStarted","Data":"e8231a3555cbcf032aa68447c91327eabe2ae06f34578516129561fc5e9b5d0c"}
Apr 23 17:52:12.402897 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:12.402837 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" event={"ID":"40b803bf-2eb9-4d1b-aaf1-45331a946c46","Type":"ContainerStarted","Data":"7141758a54788d202a7e7139774f59271acf8146011f405006d007c812fec885"}
Apr 23 17:52:12.407465 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:12.407411 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bkrt6" event={"ID":"712ef82b-3fe4-488d-9956-2e0264016fa7","Type":"ContainerStarted","Data":"fa43c5f8be6f02fb95d15ee947fcaacbe60212a4fa1208a6b1dd19684068b6cc"}
Apr 23 17:52:13.302803 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:13.302757 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:47:11 +0000 UTC" deadline="2027-11-23 14:33:51.962370824 +0000 UTC"
Apr 23 17:52:13.302803 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:13.302795 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13892h41m38.659579611s"
Apr 23 17:52:16.416479 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:16.416447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-102.ec2.internal" event={"ID":"13a2aab92beaa8cd38c68b02321633e1","Type":"ContainerStarted","Data":"8a4ad9a592484a351afacaf7a26888d3bf47f271cdf364fd297fa0aa21753320"}
Apr 23 17:52:16.417824 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:16.417799 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z6prg" event={"ID":"82216d67-3ae3-4fd5-be5c-85a939836d44","Type":"ContainerStarted","Data":"7c6f5a4f74f1765ce15bed3d029b40657c57305d385d65b551b39eaa65a75e36"}
Apr 23 17:52:16.428996 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:16.428936 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-102.ec2.internal" podStartSLOduration=5.428923578 podStartE2EDuration="5.428923578s" podCreationTimestamp="2026-04-23 17:52:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:52:16.428775997 +0000 UTC m=+6.664222702" watchObservedRunningTime="2026-04-23 17:52:16.428923578 +0000 UTC m=+6.664370281"
Apr 23 17:52:16.448398 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:16.448345 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-z6prg" podStartSLOduration=1.935400527 podStartE2EDuration="6.448333022s" podCreationTimestamp="2026-04-23 17:52:10 +0000 UTC" firstStartedPulling="2026-04-23 17:52:11.616475783 +0000 UTC m=+1.851922472" lastFinishedPulling="2026-04-23 17:52:16.129408285 +0000 UTC m=+6.364854967" observedRunningTime="2026-04-23 17:52:16.448313684 +0000 UTC m=+6.683760388" watchObservedRunningTime="2026-04-23 17:52:16.448333022 +0000 UTC m=+6.683779726"
Apr 23 17:52:17.421060 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:17.420819 2576 generic.go:358] "Generic (PLEG): container finished" podID="1a97617bff83f0fe8f7c02c491ea0ab2" containerID="c32fbe05ee29838cc2e40a8bf9e2cf8776ce3cf5692f66d538aadcd5d093fdbc" exitCode=0
Apr 23 17:52:17.421601 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:17.420917 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal" event={"ID":"1a97617bff83f0fe8f7c02c491ea0ab2","Type":"ContainerDied","Data":"c32fbe05ee29838cc2e40a8bf9e2cf8776ce3cf5692f66d538aadcd5d093fdbc"}
Apr 23 17:52:17.422627 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:17.422599 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9pnhp" event={"ID":"4c608978-9ca3-4730-81a8-ed012e4601c4","Type":"ContainerStarted","Data":"a12a5cbc5e0e8400e3757bb9e97b3318d7ba44ee3c6472e7dcc8273f0b0bd8d7"}
Apr 23 17:52:17.424877 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:17.424851 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" event={"ID":"40b803bf-2eb9-4d1b-aaf1-45331a946c46","Type":"ContainerStarted","Data":"4d6e52ed7e0892d970f1fc4296a846182ccfdc14b4d6c58ee6d0593aadf17087"}
Apr 23 17:52:17.426198 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:17.426173 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bkrt6" event={"ID":"712ef82b-3fe4-488d-9956-2e0264016fa7","Type":"ContainerStarted","Data":"325604eddeab82e4898724e8395358784ed0de05b4640caeeb79b8b6e23c30fc"}
Apr 23 17:52:17.446650 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:17.446593 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="kube-system/konnectivity-agent-bkrt6" podStartSLOduration=2.943102307 podStartE2EDuration="7.446576272s" podCreationTimestamp="2026-04-23 17:52:10 +0000 UTC" firstStartedPulling="2026-04-23 17:52:11.5836613 +0000 UTC m=+1.819107982" lastFinishedPulling="2026-04-23 17:52:16.087135262 +0000 UTC m=+6.322581947" observedRunningTime="2026-04-23 17:52:17.446563532 +0000 UTC m=+7.682010236" watchObservedRunningTime="2026-04-23 17:52:17.446576272 +0000 UTC m=+7.682022975"
Apr 23 17:52:17.458515 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:17.458463 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9pnhp" podStartSLOduration=2.994489464 podStartE2EDuration="7.458446789s" podCreationTimestamp="2026-04-23 17:52:10 +0000 UTC" firstStartedPulling="2026-04-23 17:52:11.619305481 +0000 UTC m=+1.854752162" lastFinishedPulling="2026-04-23 17:52:16.083262802 +0000 UTC m=+6.318709487" observedRunningTime="2026-04-23 17:52:17.458241004 +0000 UTC m=+7.693687708" watchObservedRunningTime="2026-04-23 17:52:17.458446789 +0000 UTC m=+7.693893492"
Apr 23 17:52:17.512863 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:17.512843 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 17:52:18.326614 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:18.326454 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T17:52:17.512858745Z","UUID":"d1730d8f-a0ec-4530-81dc-e659af72f12f","Handler":null,"Name":"","Endpoint":""}
Apr 23 17:52:18.327872 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:18.327855 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 17:52:18.327984 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:18.327880 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 17:52:18.432674 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:18.432640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" event={"ID":"40b803bf-2eb9-4d1b-aaf1-45331a946c46","Type":"ContainerStarted","Data":"54f8129a14aef2d4f27d959d11cda3e3c7b6dcec9831a0f81b363d53f3dd5f39"}
Apr 23 17:52:18.432674 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:18.432677 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" event={"ID":"40b803bf-2eb9-4d1b-aaf1-45331a946c46","Type":"ContainerStarted","Data":"dc09a8e50b783e625e91df1f9436388331134dffb5fc8af41911eb7da98710f3"}
Apr 23 17:52:18.434110 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:18.434083 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal" event={"ID":"1a97617bff83f0fe8f7c02c491ea0ab2","Type":"ContainerStarted","Data":"a74f813085da825a52dad2c20e5b6f208237bafa298e39568be91e546b93ecb5"}
Apr 23 17:52:18.455231 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:18.455187 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f5jrt" podStartSLOduration=1.832591468 podStartE2EDuration="8.455174389s" podCreationTimestamp="2026-04-23 17:52:10 +0000 UTC" firstStartedPulling="2026-04-23 17:52:11.588357862 +0000 UTC m=+1.823804543" lastFinishedPulling="2026-04-23 17:52:18.210940783 +0000 UTC m=+8.446387464" observedRunningTime="2026-04-23 17:52:18.454927925 +0000 UTC m=+8.690374630" watchObservedRunningTime="2026-04-23 17:52:18.455174389 +0000 UTC m=+8.690621092"
Apr 23 17:52:18.470369 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:18.470296 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-102.ec2.internal" podStartSLOduration=7.470283947 podStartE2EDuration="7.470283947s" podCreationTimestamp="2026-04-23 17:52:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:52:18.470147588 +0000 UTC m=+8.705594291" watchObservedRunningTime="2026-04-23 17:52:18.470283947 +0000 UTC m=+8.705730650"
Apr 23 17:52:18.703063 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:18.703024 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-bkrt6"
Apr 23 17:52:18.703630 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:18.703613 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-bkrt6"
Apr 23 17:52:20.436248 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:20.436219 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 17:52:24.772938 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:24.772899 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-q7mhh"]
Apr 23 17:52:24.775225 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:24.775209 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:24.775299 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:24.775269 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887"
Apr 23 17:52:24.837241 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:24.837206 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c10ccf97-5e76-4972-b775-25d5b2e5a887-dbus\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:24.837335 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:24.837255 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c10ccf97-5e76-4972-b775-25d5b2e5a887-kubelet-config\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:24.837335 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:24.837283 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:24.938099 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:24.938062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c10ccf97-5e76-4972-b775-25d5b2e5a887-dbus\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:24.938099 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:24.938101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c10ccf97-5e76-4972-b775-25d5b2e5a887-kubelet-config\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:24.938286 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:24.938119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:24.938286 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:24.938200 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c10ccf97-5e76-4972-b775-25d5b2e5a887-kubelet-config\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:24.938286 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:24.938240 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c10ccf97-5e76-4972-b775-25d5b2e5a887-dbus\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:24.938286 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:24.938250 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:52:24.938417 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:24.938339 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret podName:c10ccf97-5e76-4972-b775-25d5b2e5a887 nodeName:}" failed.
No retries permitted until 2026-04-23 17:52:25.438307494 +0000 UTC m=+15.673754189 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret") pod "global-pull-secret-syncer-q7mhh" (UID: "c10ccf97-5e76-4972-b775-25d5b2e5a887") : object "kube-system"/"original-pull-secret" not registered
Apr 23 17:52:25.441956 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:25.441924 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:25.442167 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:25.442041 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:52:25.442167 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:25.442101 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret podName:c10ccf97-5e76-4972-b775-25d5b2e5a887 nodeName:}" failed. No retries permitted until 2026-04-23 17:52:26.442083807 +0000 UTC m=+16.677530493 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret") pod "global-pull-secret-syncer-q7mhh" (UID: "c10ccf97-5e76-4972-b775-25d5b2e5a887") : object "kube-system"/"original-pull-secret" not registered
Apr 23 17:52:26.389830 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:26.389796 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:26.390259 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:26.389895 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887"
Apr 23 17:52:26.448302 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:26.448268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:26.448480 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:26.448360 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:52:26.448480 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:26.448409 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret podName:c10ccf97-5e76-4972-b775-25d5b2e5a887 nodeName:}" failed. No retries permitted until 2026-04-23 17:52:28.448396682 +0000 UTC m=+18.683843363 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret") pod "global-pull-secret-syncer-q7mhh" (UID: "c10ccf97-5e76-4972-b775-25d5b2e5a887") : object "kube-system"/"original-pull-secret" not registered
Apr 23 17:52:28.389696 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:28.389659 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:28.390071 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:28.389787 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887"
Apr 23 17:52:28.459911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:28.459868 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:28.460078 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:28.459983 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:52:28.460078 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:28.460038 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret podName:c10ccf97-5e76-4972-b775-25d5b2e5a887 nodeName:}" failed. No retries permitted until 2026-04-23 17:52:32.460022542 +0000 UTC m=+22.695469224 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret") pod "global-pull-secret-syncer-q7mhh" (UID: "c10ccf97-5e76-4972-b775-25d5b2e5a887") : object "kube-system"/"original-pull-secret" not registered
Apr 23 17:52:30.389724 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:30.389687 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:30.390316 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:30.389805 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887"
Apr 23 17:52:32.059040 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:32.059012 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-bkrt6"
Apr 23 17:52:32.059505 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:32.059133 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 17:52:32.059559 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:32.059534 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-bkrt6"
Apr 23 17:52:32.389613 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:32.389577 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:32.389794 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:32.389698 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887"
Apr 23 17:52:32.483000 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:32.482959 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:52:32.483168 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:32.483106 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:52:32.483210 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:32.483185 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret podName:c10ccf97-5e76-4972-b775-25d5b2e5a887 nodeName:}" failed. No retries permitted until 2026-04-23 17:52:40.48316824 +0000 UTC m=+30.718614933 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret") pod "global-pull-secret-syncer-q7mhh" (UID: "c10ccf97-5e76-4972-b775-25d5b2e5a887") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:52:34.390249 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:34.390215 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:52:34.390733 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:34.390311 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:52:36.390244 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:36.390203 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:52:36.390615 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:36.390305 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:52:38.390167 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:38.390134 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:52:38.390544 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:38.390241 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:52:40.390186 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:40.390152 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:52:40.390581 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:40.390248 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:52:40.524227 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:40.524181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:52:40.524393 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:40.524294 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:52:40.524393 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:40.524343 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret podName:c10ccf97-5e76-4972-b775-25d5b2e5a887 nodeName:}" failed. No retries permitted until 2026-04-23 17:52:56.524331168 +0000 UTC m=+46.759777855 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret") pod "global-pull-secret-syncer-q7mhh" (UID: "c10ccf97-5e76-4972-b775-25d5b2e5a887") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:52:42.389540 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:42.389503 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:52:42.389962 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:42.389604 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:52:44.389834 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:44.389799 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:52:44.390219 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:44.389902 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:52:46.389495 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:46.389453 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:52:46.389904 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:46.389589 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:52:48.389822 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:48.389786 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:52:48.390228 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:48.389886 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:52:50.389568 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:50.389533 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:52:50.389953 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:50.389619 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:52:52.389460 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:52.389420 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:52:52.389983 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:52.389521 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:52:54.389870 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:54.389832 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:52:54.390357 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:54.389930 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:52:56.389681 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:56.389644 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:52:56.390124 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:56.389777 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:52:56.621844 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:56.621805 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:52:56.622040 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:56.621927 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:52:56.622040 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:56.621990 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret podName:c10ccf97-5e76-4972-b775-25d5b2e5a887 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:28.621973098 +0000 UTC m=+78.857419783 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret") pod "global-pull-secret-syncer-q7mhh" (UID: "c10ccf97-5e76-4972-b775-25d5b2e5a887") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:52:58.389645 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:52:58.389605 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:52:58.390133 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:52:58.389735 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:00.390368 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:00.390336 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:00.390729 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:00.390416 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:02.391576 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:02.391543 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:02.392022 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:02.391657 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:04.389755 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:04.389717 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:04.390258 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:04.389859 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:06.389844 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:06.389804 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:06.390328 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:06.389906 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:08.389873 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:08.389841 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:08.390246 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:08.389940 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:10.389685 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:10.389513 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:10.390064 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:10.389735 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:12.389417 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:12.389380 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:12.389805 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:12.389484 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:14.390034 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:14.390000 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:14.390394 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:14.390100 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:16.389351 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:16.389313 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:16.389852 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:16.389410 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:18.390142 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:18.390107 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:18.390507 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:18.390210 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:20.389873 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:20.389841 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:20.390264 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:20.389933 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:22.390242 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:22.390209 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:22.390660 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:22.390312 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:24.390283 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:24.390152 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:24.390283 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:24.390269 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:26.392203 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:26.392172 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:26.392576 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:26.392269 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:28.389881 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:28.389847 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:28.390231 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:28.389948 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:28.702517 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:28.702424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:28.702659 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:28.702532 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:53:28.702659 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:28.702583 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret podName:c10ccf97-5e76-4972-b775-25d5b2e5a887 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:32.702569825 +0000 UTC m=+142.938016511 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret") pod "global-pull-secret-syncer-q7mhh" (UID: "c10ccf97-5e76-4972-b775-25d5b2e5a887") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:53:30.390786 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:30.390727 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:30.391467 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:30.391435 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:32.389683 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:32.389651 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:32.390090 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:32.389757 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:34.389728 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:34.389693 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:34.390146 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:34.389815 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:36.389224 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:36.389189 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:36.389669 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:36.389289 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:38.391605 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:38.391575 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:38.391994 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:38.391672 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:40.391821 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:40.391792 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:40.392290 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:40.392134 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:42.389444 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:42.389412 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:42.389844 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:42.389507 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:44.390188 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:44.390153 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:44.390562 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:44.390254 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:46.389511 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:46.389475 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:46.389932 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:46.389576 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:48.389472 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:48.389431 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:48.389931 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:48.389560 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:50.390010 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:50.389974 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:50.390377 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:50.390058 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:52.389865 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:52.389827 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:52.390293 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:52.389953 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:54.390201 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:54.390167 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:54.390633 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:54.390297 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:56.389276 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:56.389235 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:56.389647 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:56.389338 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:53:58.389722 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:53:58.389683 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:53:58.390192 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:53:58.389832 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:00.389802 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:00.389770 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:00.390168 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:00.389856 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:02.389876 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:02.389845 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:02.390343 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:02.389956 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:04.389449 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:04.389414 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:04.389834 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:04.389517 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:06.389877 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:06.389840 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:54:06.390318 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:06.389965 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887"
Apr 23 17:54:08.389557 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:08.389517 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:54:08.390154 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:08.389625 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887"
Apr 23 17:54:10.273294 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:10.273268 2576 kubelet_node_status.go:509] "Node not becoming ready in time after startup"
Apr 23 17:54:10.344500 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:10.344466 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:54:10.389476 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:10.389441 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:10.389646 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:10.389547 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:12.389717 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:12.389671 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:12.390206 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:12.389794 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:14.390046 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:14.390009 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:14.390515 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:14.390109 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:15.345914 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:15.345878 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:54:16.390241 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:16.390200 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:16.390717 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:16.390303 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:18.390089 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:18.390049 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:18.390468 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:18.390185 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:20.346790 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:20.346736 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:54:20.389627 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:20.389595 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:20.389797 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:20.389701 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:22.389894 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:22.389855 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:22.390250 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:22.389987 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:24.389440 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:24.389401 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:24.389829 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:24.389508 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:25.348018 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:25.347979 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:54:26.389430 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:26.389391 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:26.389932 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:26.389553 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:28.389537 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:28.389504 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:28.389950 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:28.389614 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:30.349192 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:30.349153 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:54:30.390023 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:30.389999 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:30.390114 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:30.390078 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:32.390013 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:32.389973 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:54:32.390433 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:32.390103 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887"
Apr 23 17:54:32.769562 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:32.769471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:54:32.769700 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:32.769577 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:54:32.769700 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:32.769674 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret podName:c10ccf97-5e76-4972-b775-25d5b2e5a887 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:34.769652833 +0000 UTC m=+265.005099520 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret") pod "global-pull-secret-syncer-q7mhh" (UID: "c10ccf97-5e76-4972-b775-25d5b2e5a887") : object "kube-system"/"original-pull-secret" not registered
Apr 23 17:54:34.392364 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:34.392331 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:54:34.392803 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:34.392447 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887"
Apr 23 17:54:35.349986 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:35.349947 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:54:36.390067 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:36.390034 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:54:36.390440 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:36.390135 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:38.390132 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:38.390094 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:38.390507 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:38.390208 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:40.350458 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:40.350420 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:54:40.390260 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:40.390235 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:40.390379 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:40.390310 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:42.390054 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:42.390012 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:42.390517 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:42.390234 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:44.389766 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:44.389716 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:44.390141 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:44.389849 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:45.351398 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:45.351334 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:54:46.389863 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:46.389827 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:46.390333 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:46.389940 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:48.390097 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:48.390064 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:48.390451 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:48.390166 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:50.351866 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:50.351827 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:54:50.389546 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:50.389515 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:50.389678 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:50.389617 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:52.389542 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:52.389504 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:52.389946 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:52.389648 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:54.389430 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:54.389350 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:54.389785 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:54.389476 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:55.352877 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:55.352841 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:54:56.389343 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:56.389307 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:56.389711 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:56.389408 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:54:58.389865 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:54:58.389831 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:54:58.390218 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:54:58.389933 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:00.353450 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:00.353416 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:00.390135 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:00.390108 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:00.390214 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:00.390200 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:02.389563 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.389525 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:02.389947 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:02.389635 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887"
Apr 23 17:55:02.805148 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.805067 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-6g56n"]
Apr 23 17:55:02.807617 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.807601 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6g56n"
Apr 23 17:55:02.813109 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.813079 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 17:55:02.813223 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.813149 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 17:55:02.813603 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.813504 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-4kn7n\""
Apr 23 17:55:02.814763 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.813799 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 17:55:02.816636 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.816619 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 17:55:02.927105 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927073 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-cnibin\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n"
Apr 23 17:55:02.927230 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927112 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-var-lib-kubelet\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n"
Apr 23 17:55:02.927230 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927132 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-multus-socket-dir-parent\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n"
Apr 23 17:55:02.927230 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927148 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ae56a92f-dfae-4763-b849-dca72bc2cf3d-multus-daemon-config\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n"
Apr 23 17:55:02.927230 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927163 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-run-multus-certs\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n"
Apr 23 17:55:02.927230 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927199 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-system-cni-dir\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n"
Apr 23 17:55:02.927230 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-var-lib-cni-multus\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n"
Apr 23 17:55:02.927438 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-multus-cni-dir\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n"
Apr 23 17:55:02.927438 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927301 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-hostroot\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n"
Apr 23 17:55:02.927438 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927329 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-multus-conf-dir\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n"
Apr 23 17:55:02.927438 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927351 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx8dp\" (UniqueName: \"kubernetes.io/projected/ae56a92f-dfae-4763-b849-dca72bc2cf3d-kube-api-access-xx8dp\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " 
pod="openshift-multus/multus-6g56n" Apr 23 17:55:02.927438 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ae56a92f-dfae-4763-b849-dca72bc2cf3d-cni-binary-copy\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:02.927438 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-run-k8s-cni-cncf-io\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:02.927438 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927413 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-var-lib-cni-bin\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:02.927438 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927428 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-etc-kubernetes\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:02.927716 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927444 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-os-release\") pod \"multus-6g56n\" (UID: 
\"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:02.927716 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:02.927483 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-run-netns\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.006847 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.006819 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-ncg5g"] Apr 23 17:55:03.009514 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.009499 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.012354 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.012330 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-sv4gq\"" Apr 23 17:55:03.012530 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.012510 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 17:55:03.014052 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.014035 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 17:55:03.028231 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ae56a92f-dfae-4763-b849-dca72bc2cf3d-cni-binary-copy\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028311 ip-10-0-132-102 kubenswrapper[2576]: 
I0423 17:55:03.028240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-run-k8s-cni-cncf-io\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028311 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028256 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-var-lib-cni-bin\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028311 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-etc-kubernetes\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028311 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-os-release\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028311 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028304 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-run-netns\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028320 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-cnibin\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-etc-kubernetes\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028337 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-var-lib-kubelet\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-var-lib-cni-bin\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028342 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-run-k8s-cni-cncf-io\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028368 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-run-netns\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-multus-socket-dir-parent\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028391 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-cnibin\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ae56a92f-dfae-4763-b849-dca72bc2cf3d-multus-daemon-config\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028419 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-os-release\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028432 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-run-multus-certs\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028434 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-multus-socket-dir-parent\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028434 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-var-lib-kubelet\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-run-multus-certs\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-system-cni-dir\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.028552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-var-lib-cni-multus\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.029158 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028569 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-system-cni-dir\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.029158 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028589 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-multus-cni-dir\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.029158 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028610 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-host-var-lib-cni-multus\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.029158 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-hostroot\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.029158 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028641 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-multus-conf-dir\") 
pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.029158 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028653 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-multus-cni-dir\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.029158 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028652 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-hostroot\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.029158 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xx8dp\" (UniqueName: \"kubernetes.io/projected/ae56a92f-dfae-4763-b849-dca72bc2cf3d-kube-api-access-xx8dp\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.029158 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.028676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ae56a92f-dfae-4763-b849-dca72bc2cf3d-multus-conf-dir\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.029486 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.029467 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ae56a92f-dfae-4763-b849-dca72bc2cf3d-multus-daemon-config\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " 
pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.030022 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.030004 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ae56a92f-dfae-4763-b849-dca72bc2cf3d-cni-binary-copy\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.042911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.042884 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx8dp\" (UniqueName: \"kubernetes.io/projected/ae56a92f-dfae-4763-b849-dca72bc2cf3d-kube-api-access-xx8dp\") pod \"multus-6g56n\" (UID: \"ae56a92f-dfae-4763-b849-dca72bc2cf3d\") " pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.117929 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.117905 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6g56n" Apr 23 17:55:03.129731 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.129655 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hb4n\" (UniqueName: \"kubernetes.io/projected/0c55482f-ee0e-4a40-a959-7530a690f4c2-kube-api-access-7hb4n\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.129731 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.129683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c55482f-ee0e-4a40-a959-7530a690f4c2-system-cni-dir\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.129731 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.129700 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c55482f-ee0e-4a40-a959-7530a690f4c2-os-release\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.129731 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.129724 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c55482f-ee0e-4a40-a959-7530a690f4c2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.129939 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.129778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0c55482f-ee0e-4a40-a959-7530a690f4c2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.129939 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.129797 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c55482f-ee0e-4a40-a959-7530a690f4c2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.129939 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.129823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/0c55482f-ee0e-4a40-a959-7530a690f4c2-cnibin\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.129939 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.129862 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c55482f-ee0e-4a40-a959-7530a690f4c2-cni-binary-copy\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.230664 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.230626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hb4n\" (UniqueName: \"kubernetes.io/projected/0c55482f-ee0e-4a40-a959-7530a690f4c2-kube-api-access-7hb4n\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.230664 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.230663 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c55482f-ee0e-4a40-a959-7530a690f4c2-system-cni-dir\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.230871 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.230679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c55482f-ee0e-4a40-a959-7530a690f4c2-os-release\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.230871 
ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.230725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c55482f-ee0e-4a40-a959-7530a690f4c2-system-cni-dir\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.230871 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.230795 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c55482f-ee0e-4a40-a959-7530a690f4c2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.230871 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.230816 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c55482f-ee0e-4a40-a959-7530a690f4c2-os-release\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.230871 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.230854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0c55482f-ee0e-4a40-a959-7530a690f4c2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.231044 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.230877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c55482f-ee0e-4a40-a959-7530a690f4c2-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.231044 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.230903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c55482f-ee0e-4a40-a959-7530a690f4c2-cnibin\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.231044 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.230929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c55482f-ee0e-4a40-a959-7530a690f4c2-cni-binary-copy\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.231044 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.230962 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c55482f-ee0e-4a40-a959-7530a690f4c2-cnibin\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.231175 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.231094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c55482f-ee0e-4a40-a959-7530a690f4c2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.231346 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.231328 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0c55482f-ee0e-4a40-a959-7530a690f4c2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.232004 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.231987 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c55482f-ee0e-4a40-a959-7530a690f4c2-cni-binary-copy\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.232076 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.232060 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c55482f-ee0e-4a40-a959-7530a690f4c2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.240876 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.240852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hb4n\" (UniqueName: \"kubernetes.io/projected/0c55482f-ee0e-4a40-a959-7530a690f4c2-kube-api-access-7hb4n\") pod \"multus-additional-cni-plugins-ncg5g\" (UID: \"0c55482f-ee0e-4a40-a959-7530a690f4c2\") " pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.318806 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.318768 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ncg5g" Apr 23 17:55:03.326159 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:55:03.326132 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c55482f_ee0e_4a40_a959_7530a690f4c2.slice/crio-d9f65eef16ad4677393a70a6619d615ae3e2db97207268f547025790bec056d2 WatchSource:0}: Error finding container d9f65eef16ad4677393a70a6619d615ae3e2db97207268f547025790bec056d2: Status 404 returned error can't find the container with id d9f65eef16ad4677393a70a6619d615ae3e2db97207268f547025790bec056d2 Apr 23 17:55:03.668055 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.668016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6g56n" event={"ID":"ae56a92f-dfae-4763-b849-dca72bc2cf3d","Type":"ContainerStarted","Data":"661b173881bfbbf7bc4f3c2146db58117e4bb3e584347b9a178656660e3f6efe"} Apr 23 17:55:03.669134 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.669103 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ncg5g" event={"ID":"0c55482f-ee0e-4a40-a959-7530a690f4c2","Type":"ContainerStarted","Data":"d9f65eef16ad4677393a70a6619d615ae3e2db97207268f547025790bec056d2"} Apr 23 17:55:03.795027 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.794990 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-jfhpv"] Apr 23 17:55:03.797783 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.797761 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:03.797899 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:03.797840 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:03.937012 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.936922 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88vnc\" (UniqueName: \"kubernetes.io/projected/5baefb5e-77f1-440a-918c-82da4620b8d7-kube-api-access-88vnc\") pod \"network-metrics-daemon-jfhpv\" (UID: \"5baefb5e-77f1-440a-918c-82da4620b8d7\") " pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:03.937012 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:03.936974 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs\") pod \"network-metrics-daemon-jfhpv\" (UID: \"5baefb5e-77f1-440a-918c-82da4620b8d7\") " pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:04.038008 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:04.037951 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88vnc\" (UniqueName: \"kubernetes.io/projected/5baefb5e-77f1-440a-918c-82da4620b8d7-kube-api-access-88vnc\") pod \"network-metrics-daemon-jfhpv\" (UID: \"5baefb5e-77f1-440a-918c-82da4620b8d7\") " pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:04.038008 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:04.038011 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs\") pod \"network-metrics-daemon-jfhpv\" (UID: \"5baefb5e-77f1-440a-918c-82da4620b8d7\") " pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:04.038247 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:04.038154 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:04.038247 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:04.038206 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs podName:5baefb5e-77f1-440a-918c-82da4620b8d7 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:04.538191709 +0000 UTC m=+174.773638390 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs") pod "network-metrics-daemon-jfhpv" (UID: "5baefb5e-77f1-440a-918c-82da4620b8d7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:04.048087 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:04.048029 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88vnc\" (UniqueName: \"kubernetes.io/projected/5baefb5e-77f1-440a-918c-82da4620b8d7-kube-api-access-88vnc\") pod \"network-metrics-daemon-jfhpv\" (UID: \"5baefb5e-77f1-440a-918c-82da4620b8d7\") " pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:04.389517 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:04.389480 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:04.389715 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:04.389623 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:04.542047 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:04.541963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs\") pod \"network-metrics-daemon-jfhpv\" (UID: \"5baefb5e-77f1-440a-918c-82da4620b8d7\") " pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:04.542237 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:04.542131 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:04.542237 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:04.542221 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs podName:5baefb5e-77f1-440a-918c-82da4620b8d7 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:05.542201711 +0000 UTC m=+175.777648398 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs") pod "network-metrics-daemon-jfhpv" (UID: "5baefb5e-77f1-440a-918c-82da4620b8d7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:05.354493 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:05.354455 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:05.389825 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:05.389787 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:05.389997 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:05.389923 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:05.548369 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:05.548326 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs\") pod \"network-metrics-daemon-jfhpv\" (UID: \"5baefb5e-77f1-440a-918c-82da4620b8d7\") " pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:05.548542 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:05.548449 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:05.548542 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:05.548504 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs podName:5baefb5e-77f1-440a-918c-82da4620b8d7 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:07.548489981 +0000 UTC m=+177.783936662 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs") pod "network-metrics-daemon-jfhpv" (UID: "5baefb5e-77f1-440a-918c-82da4620b8d7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:05.673871 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:05.673824 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c55482f-ee0e-4a40-a959-7530a690f4c2" containerID="6d1947c20602f933f7ac4841f1f8c1112de68e76602a2b9d560c2f66ac5f72f5" exitCode=0 Apr 23 17:55:05.674023 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:05.673885 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ncg5g" event={"ID":"0c55482f-ee0e-4a40-a959-7530a690f4c2","Type":"ContainerDied","Data":"6d1947c20602f933f7ac4841f1f8c1112de68e76602a2b9d560c2f66ac5f72f5"} Apr 23 17:55:06.389699 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:06.389661 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:06.390175 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:06.389972 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:07.389917 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:07.389881 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:07.390338 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:07.390135 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:07.561360 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:07.561253 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs\") pod \"network-metrics-daemon-jfhpv\" (UID: \"5baefb5e-77f1-440a-918c-82da4620b8d7\") " pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:07.561551 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:07.561368 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:07.561551 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:07.561457 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs podName:5baefb5e-77f1-440a-918c-82da4620b8d7 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:11.561439109 +0000 UTC m=+181.796885794 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs") pod "network-metrics-daemon-jfhpv" (UID: "5baefb5e-77f1-440a-918c-82da4620b8d7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:08.389395 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:08.389360 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:08.389587 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:08.389496 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:09.389932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:09.389899 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:09.390314 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:09.390020 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:10.355498 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:10.355462 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Apr 23 17:55:10.390448 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:10.390416 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:10.390933 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:10.390516 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:11.389682 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:11.389642 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:11.389968 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:11.389787 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:11.584029 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:11.583985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs\") pod \"network-metrics-daemon-jfhpv\" (UID: \"5baefb5e-77f1-440a-918c-82da4620b8d7\") " pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:11.584514 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:11.584179 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:11.584514 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:11.584285 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs podName:5baefb5e-77f1-440a-918c-82da4620b8d7 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:19.584261365 +0000 UTC m=+189.819708061 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs") pod "network-metrics-daemon-jfhpv" (UID: "5baefb5e-77f1-440a-918c-82da4620b8d7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:12.389864 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:12.389826 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:12.390035 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:12.389934 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:13.390097 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.389926 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:13.390466 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:13.390171 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:13.606346 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.606282 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gxfwt"] Apr 23 17:55:13.609805 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.609790 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.612920 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.612882 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 17:55:13.612920 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.612890 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 17:55:13.613101 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.612934 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 17:55:13.613101 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.612957 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 17:55:13.613101 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.612890 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 17:55:13.613101 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.612969 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 17:55:13.613351 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.613336 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kznb8\"" Apr 23 17:55:13.689339 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.689306 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c55482f-ee0e-4a40-a959-7530a690f4c2" containerID="cd86d23562ad8b450d959bcf8e91eba68180e5b33472cd5ecf860d70e658424f" exitCode=0 Apr 23 17:55:13.689523 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.689378 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/multus-additional-cni-plugins-ncg5g" event={"ID":"0c55482f-ee0e-4a40-a959-7530a690f4c2","Type":"ContainerDied","Data":"cd86d23562ad8b450d959bcf8e91eba68180e5b33472cd5ecf860d70e658424f"} Apr 23 17:55:13.690512 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.690489 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6g56n" event={"ID":"ae56a92f-dfae-4763-b849-dca72bc2cf3d","Type":"ContainerStarted","Data":"aca5adc6a19b0a329c7471b6ff92d6713f7c1042f723106b7edf813e54eb35d6"} Apr 23 17:55:13.694847 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.694828 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-run-ovn-kubernetes\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.694937 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.694857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-ovnkube-script-lib\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.694937 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.694888 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9530314a-cfd7-4042-95d5-610ca46c5b81-ovn-node-metrics-cert\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695047 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.694939 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-kubelet\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695047 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.694978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695047 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.695011 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-cni-netd\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695196 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.695049 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-log-socket\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695196 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.695074 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-openvswitch\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695196 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.695100 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-systemd-units\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695196 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.695125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-run-netns\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695196 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.695147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-env-overrides\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695196 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.695171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-ovnkube-config\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695196 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.695194 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-ovn\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695489 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.695246 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-systemd\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695489 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.695273 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-var-lib-openvswitch\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695489 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.695299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-node-log\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695489 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.695324 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-slash\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695489 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.695347 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-etc-openvswitch\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695489 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.695374 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-cni-bin\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.695489 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.695398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46q8k\" (UniqueName: \"kubernetes.io/projected/9530314a-cfd7-4042-95d5-610ca46c5b81-kube-api-access-46q8k\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:13.721549 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.721505 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6g56n" podStartSLOduration=1.8591494370000001 podStartE2EDuration="11.721494479s" podCreationTimestamp="2026-04-23 17:55:02 +0000 UTC" firstStartedPulling="2026-04-23 17:55:03.12719001 +0000 UTC m=+173.362636697" lastFinishedPulling="2026-04-23 17:55:12.989535041 +0000 UTC m=+183.224981739" observedRunningTime="2026-04-23 17:55:13.720970353 +0000 UTC m=+183.956417055" watchObservedRunningTime="2026-04-23 17:55:13.721494479 +0000 UTC m=+183.956941181" Apr 23 17:55:13.796204 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796163 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796204 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-cni-netd\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796425 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-cni-netd\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796425 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796280 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796425 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796301 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-log-socket\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796425 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-log-socket\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796425 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796387 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-openvswitch\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796653 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-systemd-units\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796653 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-run-netns\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796653 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-env-overrides\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796653 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796477 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-openvswitch\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796653 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796510 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-ovnkube-config\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796653 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796514 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-systemd-units\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796653 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796528 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-run-netns\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796653 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796534 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-ovn\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796653 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-ovn\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796653 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-systemd\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796653 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796633 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-var-lib-openvswitch\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.796653 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-node-log\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-slash\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-systemd\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-etc-openvswitch\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796730 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-node-log\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796733 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-cni-bin\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796790 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-cni-bin\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46q8k\" (UniqueName: \"kubernetes.io/projected/9530314a-cfd7-4042-95d5-610ca46c5b81-kube-api-access-46q8k\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-slash\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796842 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-etc-openvswitch\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796853 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-var-lib-openvswitch\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-run-ovn-kubernetes\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796981 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-ovnkube-script-lib\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.796999 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-env-overrides\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.797044 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-run-ovn-kubernetes\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.797083 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9530314a-cfd7-4042-95d5-610ca46c5b81-ovn-node-metrics-cert\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.797103 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-ovnkube-config\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.797126 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-kubelet\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797774 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.797184 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-kubelet\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.797774 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.797399 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-ovnkube-script-lib\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.800141 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.800120 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9530314a-cfd7-4042-95d5-610ca46c5b81-ovn-node-metrics-cert\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.804916 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.804901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46q8k\" (UniqueName: \"kubernetes.io/projected/9530314a-cfd7-4042-95d5-610ca46c5b81-kube-api-access-46q8k\") pod \"ovnkube-node-gxfwt\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.918509 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:13.918475 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt"
Apr 23 17:55:13.925776 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:55:13.925729 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9530314a_cfd7_4042_95d5_610ca46c5b81.slice/crio-be52c23a569e5a707de9910403025ace5fb32a50c3bfc3fd2e6dec259be7f0a4 WatchSource:0}: Error finding container be52c23a569e5a707de9910403025ace5fb32a50c3bfc3fd2e6dec259be7f0a4: Status 404 returned error can't find the container with id be52c23a569e5a707de9910403025ace5fb32a50c3bfc3fd2e6dec259be7f0a4
Apr 23 17:55:14.389859 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:14.389827 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:55:14.390006 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:14.389961 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887"
Apr 23 17:55:14.694783 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:14.694672 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c55482f-ee0e-4a40-a959-7530a690f4c2" containerID="a986fe581b47822f8fd3668d430ec0c02edb9fd07924dcd19c7280bc6268dbd6" exitCode=0
Apr 23 17:55:14.695433 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:14.694786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ncg5g" event={"ID":"0c55482f-ee0e-4a40-a959-7530a690f4c2","Type":"ContainerDied","Data":"a986fe581b47822f8fd3668d430ec0c02edb9fd07924dcd19c7280bc6268dbd6"}
Apr 23 17:55:14.696183 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:14.696137 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerStarted","Data":"be52c23a569e5a707de9910403025ace5fb32a50c3bfc3fd2e6dec259be7f0a4"}
Apr 23 17:55:15.356931 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:15.356883 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:55:15.389735 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:15.389707 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv"
Apr 23 17:55:15.389907 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:15.389864 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7"
Apr 23 17:55:16.390198 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:16.390161 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:55:16.390576 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:16.390337 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887"
Apr 23 17:55:16.592578 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:16.592536 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-x77gx"]
Apr 23 17:55:16.595279 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:16.595259 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx"
Apr 23 17:55:16.595393 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:16.595338 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3"
Apr 23 17:55:16.701821 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:16.701727 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c55482f-ee0e-4a40-a959-7530a690f4c2" containerID="14f3b5adce8d98e7dd4200590f62d4445536baab9d92b3a23012947c07d31ab9" exitCode=0
Apr 23 17:55:16.701821 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:16.701782 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ncg5g" event={"ID":"0c55482f-ee0e-4a40-a959-7530a690f4c2","Type":"ContainerDied","Data":"14f3b5adce8d98e7dd4200590f62d4445536baab9d92b3a23012947c07d31ab9"}
Apr 23 17:55:16.719952 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:16.719919 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stggz\" (UniqueName: \"kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz\") pod \"network-check-target-x77gx\" (UID: \"41ba5b02-a248-4259-8ca2-8f501349c1b3\") " pod="openshift-network-diagnostics/network-check-target-x77gx"
Apr 23 17:55:16.821036 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:16.820996 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stggz\" (UniqueName: \"kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz\") pod \"network-check-target-x77gx\" (UID: \"41ba5b02-a248-4259-8ca2-8f501349c1b3\") " pod="openshift-network-diagnostics/network-check-target-x77gx"
Apr 23 17:55:16.828854 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:16.828819 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:55:16.828854 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:16.828852 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:55:16.828854 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:16.828864 2576 projected.go:194] Error preparing data for projected volume kube-api-access-stggz for pod openshift-network-diagnostics/network-check-target-x77gx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:55:16.829059 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:16.828920 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz podName:41ba5b02-a248-4259-8ca2-8f501349c1b3 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:17.328904301 +0000 UTC m=+187.564350982 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-stggz" (UniqueName: "kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz") pod "network-check-target-x77gx" (UID: "41ba5b02-a248-4259-8ca2-8f501349c1b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:55:17.389810 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:17.389774 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv"
Apr 23 17:55:17.390000 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:17.389929 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7"
Apr 23 17:55:17.427238 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:17.427189 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stggz\" (UniqueName: \"kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz\") pod \"network-check-target-x77gx\" (UID: \"41ba5b02-a248-4259-8ca2-8f501349c1b3\") " pod="openshift-network-diagnostics/network-check-target-x77gx"
Apr 23 17:55:17.427695 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:17.427347 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:55:17.427695 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:17.427366 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:55:17.427695 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:17.427379 2576 projected.go:194] Error preparing data for projected volume kube-api-access-stggz for pod openshift-network-diagnostics/network-check-target-x77gx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:55:17.427695 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:17.427437 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz podName:41ba5b02-a248-4259-8ca2-8f501349c1b3 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:18.427418739 +0000 UTC m=+188.662865422 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-stggz" (UniqueName: "kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz") pod "network-check-target-x77gx" (UID: "41ba5b02-a248-4259-8ca2-8f501349c1b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:55:18.389842 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:18.389803 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx"
Apr 23 17:55:18.390105 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:18.389938 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3"
Apr 23 17:55:18.390105 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:18.390019 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:55:18.390228 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:18.390129 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887"
Apr 23 17:55:18.436169 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:18.436046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stggz\" (UniqueName: \"kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz\") pod \"network-check-target-x77gx\" (UID: \"41ba5b02-a248-4259-8ca2-8f501349c1b3\") " pod="openshift-network-diagnostics/network-check-target-x77gx"
Apr 23 17:55:18.436563 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:18.436190 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:55:18.436563 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:18.436215 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:55:18.436563 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:18.436232 2576 projected.go:194] Error preparing data for projected volume kube-api-access-stggz for pod openshift-network-diagnostics/network-check-target-x77gx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:55:18.436563 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:18.436301 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz podName:41ba5b02-a248-4259-8ca2-8f501349c1b3 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:20.436282027 +0000 UTC m=+190.671728724 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-stggz" (UniqueName: "kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz") pod "network-check-target-x77gx" (UID: "41ba5b02-a248-4259-8ca2-8f501349c1b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:55:19.390189 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:19.390148 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv"
Apr 23 17:55:19.390375 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:19.390321 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7"
Apr 23 17:55:19.646924 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:19.646826 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs\") pod \"network-metrics-daemon-jfhpv\" (UID: \"5baefb5e-77f1-440a-918c-82da4620b8d7\") " pod="openshift-multus/network-metrics-daemon-jfhpv"
Apr 23 17:55:19.647381 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:19.646972 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:55:19.647381 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:19.647042 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs podName:5baefb5e-77f1-440a-918c-82da4620b8d7 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:35.647023982 +0000 UTC m=+205.882470676 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs") pod "network-metrics-daemon-jfhpv" (UID: "5baefb5e-77f1-440a-918c-82da4620b8d7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:55:19.987204 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:19.987110 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-8rcgt"]
Apr 23 17:55:20.026070 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.026041 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8rcgt"
Apr 23 17:55:20.030541 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.030289 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 17:55:20.030690 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.030553 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:55:20.030896 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.030878 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 17:55:20.030972 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.030909 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-d25k5\""
Apr 23 17:55:20.150933 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.150893 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d34dc40a-b3d7-4330-a3aa-7c90a9055d36-host-slash\") pod \"iptables-alerter-8rcgt\" (UID: \"d34dc40a-b3d7-4330-a3aa-7c90a9055d36\") " pod="openshift-network-operator/iptables-alerter-8rcgt"
Apr 23 17:55:20.151112 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.151036 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d34dc40a-b3d7-4330-a3aa-7c90a9055d36-iptables-alerter-script\") pod \"iptables-alerter-8rcgt\" (UID: \"d34dc40a-b3d7-4330-a3aa-7c90a9055d36\") " pod="openshift-network-operator/iptables-alerter-8rcgt"
Apr 23 17:55:20.151112 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.151071 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf26z\" (UniqueName: \"kubernetes.io/projected/d34dc40a-b3d7-4330-a3aa-7c90a9055d36-kube-api-access-wf26z\") pod \"iptables-alerter-8rcgt\" (UID: \"d34dc40a-b3d7-4330-a3aa-7c90a9055d36\") " pod="openshift-network-operator/iptables-alerter-8rcgt"
Apr 23 17:55:20.251730 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.251644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d34dc40a-b3d7-4330-a3aa-7c90a9055d36-iptables-alerter-script\") pod \"iptables-alerter-8rcgt\" (UID: \"d34dc40a-b3d7-4330-a3aa-7c90a9055d36\") " pod="openshift-network-operator/iptables-alerter-8rcgt"
Apr 23 17:55:20.251730 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.251690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wf26z\" (UniqueName: \"kubernetes.io/projected/d34dc40a-b3d7-4330-a3aa-7c90a9055d36-kube-api-access-wf26z\") pod \"iptables-alerter-8rcgt\" (UID: \"d34dc40a-b3d7-4330-a3aa-7c90a9055d36\") " pod="openshift-network-operator/iptables-alerter-8rcgt"
Apr 23 17:55:20.251966 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.251736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d34dc40a-b3d7-4330-a3aa-7c90a9055d36-host-slash\") pod \"iptables-alerter-8rcgt\" (UID: \"d34dc40a-b3d7-4330-a3aa-7c90a9055d36\") " pod="openshift-network-operator/iptables-alerter-8rcgt"
Apr 23 17:55:20.251966 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.251841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d34dc40a-b3d7-4330-a3aa-7c90a9055d36-host-slash\") pod \"iptables-alerter-8rcgt\" (UID: \"d34dc40a-b3d7-4330-a3aa-7c90a9055d36\") " pod="openshift-network-operator/iptables-alerter-8rcgt"
Apr 23 17:55:20.252229 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.252209 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d34dc40a-b3d7-4330-a3aa-7c90a9055d36-iptables-alerter-script\") pod \"iptables-alerter-8rcgt\" (UID: \"d34dc40a-b3d7-4330-a3aa-7c90a9055d36\") " pod="openshift-network-operator/iptables-alerter-8rcgt"
Apr 23 17:55:20.262668 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.262643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf26z\" (UniqueName: \"kubernetes.io/projected/d34dc40a-b3d7-4330-a3aa-7c90a9055d36-kube-api-access-wf26z\") pod \"iptables-alerter-8rcgt\" (UID: \"d34dc40a-b3d7-4330-a3aa-7c90a9055d36\") " pod="openshift-network-operator/iptables-alerter-8rcgt"
Apr 23 17:55:20.338176 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.338139 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8rcgt"
Apr 23 17:55:20.358117 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:20.358073 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:55:20.390499 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.390469 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx"
Apr 23 17:55:20.390641 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:20.390580 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3"
Apr 23 17:55:20.390710 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.390641 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:55:20.390782 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:20.390755 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:20.453533 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:20.453498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stggz\" (UniqueName: \"kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz\") pod \"network-check-target-x77gx\" (UID: \"41ba5b02-a248-4259-8ca2-8f501349c1b3\") " pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:20.453718 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:20.453674 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:55:20.453718 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:20.453692 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:55:20.453718 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:20.453701 2576 projected.go:194] Error preparing data for projected volume kube-api-access-stggz for pod openshift-network-diagnostics/network-check-target-x77gx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:20.453895 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:20.453778 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz podName:41ba5b02-a248-4259-8ca2-8f501349c1b3 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:24.453749184 +0000 UTC m=+194.689195880 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-stggz" (UniqueName: "kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz") pod "network-check-target-x77gx" (UID: "41ba5b02-a248-4259-8ca2-8f501349c1b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:21.389576 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:21.389540 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:21.390117 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:21.389684 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:22.389496 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:22.389454 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:22.389801 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:22.389585 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3" Apr 23 17:55:22.389801 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:22.389639 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:22.389801 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:22.389759 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:23.389795 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:23.389754 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:23.390258 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:23.389885 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:24.389504 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:24.389471 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:24.389705 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:24.389472 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:24.389705 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:24.389627 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:24.389705 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:24.389675 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3" Apr 23 17:55:24.481878 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:24.481837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stggz\" (UniqueName: \"kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz\") pod \"network-check-target-x77gx\" (UID: \"41ba5b02-a248-4259-8ca2-8f501349c1b3\") " pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:24.482064 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:24.481968 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:55:24.482064 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:24.481990 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Apr 23 17:55:24.482064 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:24.481999 2576 projected.go:194] Error preparing data for projected volume kube-api-access-stggz for pod openshift-network-diagnostics/network-check-target-x77gx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:24.482064 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:24.482056 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz podName:41ba5b02-a248-4259-8ca2-8f501349c1b3 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:32.482036106 +0000 UTC m=+202.717482796 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-stggz" (UniqueName: "kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz") pod "network-check-target-x77gx" (UID: "41ba5b02-a248-4259-8ca2-8f501349c1b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:25.359265 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:25.359219 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:25.389622 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:25.389585 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:25.389810 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:25.389724 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:26.392876 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:26.392846 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:26.393365 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:26.392854 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:26.393365 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:26.392951 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:26.393365 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:26.393017 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3" Apr 23 17:55:27.200536 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:55:27.200501 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd34dc40a_b3d7_4330_a3aa_7c90a9055d36.slice/crio-5cd0d78f11aed2088f0bdab0e4101c6153b6d1d139937e330766f289b60a3139 WatchSource:0}: Error finding container 5cd0d78f11aed2088f0bdab0e4101c6153b6d1d139937e330766f289b60a3139: Status 404 returned error can't find the container with id 5cd0d78f11aed2088f0bdab0e4101c6153b6d1d139937e330766f289b60a3139 Apr 23 17:55:27.389867 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:27.389834 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:27.390014 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:27.389984 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:27.725117 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:27.724884 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c55482f-ee0e-4a40-a959-7530a690f4c2" containerID="423c65798f51a54a0648e26a472c1a065453f2f6c8322d53aec0830906b8f1cd" exitCode=0 Apr 23 17:55:27.725469 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:27.724965 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ncg5g" event={"ID":"0c55482f-ee0e-4a40-a959-7530a690f4c2","Type":"ContainerDied","Data":"423c65798f51a54a0648e26a472c1a065453f2f6c8322d53aec0830906b8f1cd"} Apr 23 17:55:27.727488 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:27.727456 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerStarted","Data":"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289"} Apr 23 17:55:27.727568 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:27.727489 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerStarted","Data":"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056"} Apr 23 17:55:27.727568 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:27.727505 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerStarted","Data":"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453"} Apr 23 17:55:27.727568 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:27.727515 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" 
event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerStarted","Data":"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4"} Apr 23 17:55:27.728567 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:27.728545 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8rcgt" event={"ID":"d34dc40a-b3d7-4330-a3aa-7c90a9055d36","Type":"ContainerStarted","Data":"5cd0d78f11aed2088f0bdab0e4101c6153b6d1d139937e330766f289b60a3139"} Apr 23 17:55:28.390132 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:28.390097 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:28.390313 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:28.390097 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:28.390313 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:28.390228 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:28.390313 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:28.390290 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3" Apr 23 17:55:28.733414 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:28.733373 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c55482f-ee0e-4a40-a959-7530a690f4c2" containerID="a68bb19270b3df73c0412517675b05cdaca17a2bc8d6923257aa65b11d0dcd1f" exitCode=0 Apr 23 17:55:28.733927 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:28.733465 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ncg5g" event={"ID":"0c55482f-ee0e-4a40-a959-7530a690f4c2","Type":"ContainerDied","Data":"a68bb19270b3df73c0412517675b05cdaca17a2bc8d6923257aa65b11d0dcd1f"} Apr 23 17:55:28.736247 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:28.736222 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerStarted","Data":"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf"} Apr 23 17:55:28.736350 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:28.736256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerStarted","Data":"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3"} Apr 23 17:55:29.389444 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:29.389418 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:29.389606 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:29.389521 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:29.740267 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:29.740188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ncg5g" event={"ID":"0c55482f-ee0e-4a40-a959-7530a690f4c2","Type":"ContainerStarted","Data":"c0932acfb5072f255c7e48ceb862930510b917c4f05f708d45438d9568f9f081"} Apr 23 17:55:29.759797 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:29.759755 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ncg5g" podStartSLOduration=3.887444419 podStartE2EDuration="27.759724334s" podCreationTimestamp="2026-04-23 17:55:02 +0000 UTC" firstStartedPulling="2026-04-23 17:55:03.32755655 +0000 UTC m=+173.563003236" lastFinishedPulling="2026-04-23 17:55:27.199836468 +0000 UTC m=+197.435283151" observedRunningTime="2026-04-23 17:55:29.759410509 +0000 UTC m=+199.994857211" watchObservedRunningTime="2026-04-23 17:55:29.759724334 +0000 UTC m=+199.995171036" Apr 23 17:55:30.360462 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:30.360250 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:30.390173 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:30.390151 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:30.390281 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:30.390230 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:30.390281 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:30.390239 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:30.390364 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:30.390343 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3" Apr 23 17:55:30.744347 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:30.744312 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerStarted","Data":"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae"} Apr 23 17:55:30.745423 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:30.745401 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8rcgt" event={"ID":"d34dc40a-b3d7-4330-a3aa-7c90a9055d36","Type":"ContainerStarted","Data":"14c500f4934483b2ce83489e4ed2182a098188854927dabe276bb7587f6a68d2"} Apr 23 17:55:30.759613 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:30.759565 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8rcgt" podStartSLOduration=9.55510857 podStartE2EDuration="11.759554652s" podCreationTimestamp="2026-04-23 17:55:19 +0000 UTC" firstStartedPulling="2026-04-23 17:55:27.20225698 +0000 UTC m=+197.437703668" lastFinishedPulling="2026-04-23 17:55:29.406703066 +0000 UTC 
m=+199.642149750" observedRunningTime="2026-04-23 17:55:30.759228768 +0000 UTC m=+200.994675483" watchObservedRunningTime="2026-04-23 17:55:30.759554652 +0000 UTC m=+200.995001354" Apr 23 17:55:31.389549 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:31.389498 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:31.389773 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:31.389630 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:32.390101 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:32.390069 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:32.390766 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:32.390069 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:32.390766 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:32.390173 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3" Apr 23 17:55:32.390766 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:32.390283 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:32.539170 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:32.539001 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stggz\" (UniqueName: \"kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz\") pod \"network-check-target-x77gx\" (UID: \"41ba5b02-a248-4259-8ca2-8f501349c1b3\") " pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:32.539170 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:32.539158 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:55:32.539170 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:32.539179 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:55:32.539351 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:32.539189 2576 projected.go:194] Error preparing data for projected volume kube-api-access-stggz for pod openshift-network-diagnostics/network-check-target-x77gx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:32.539351 ip-10-0-132-102 kubenswrapper[2576]: E0423 
17:55:32.539243 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz podName:41ba5b02-a248-4259-8ca2-8f501349c1b3 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:48.539224183 +0000 UTC m=+218.774670877 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-stggz" (UniqueName: "kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz") pod "network-check-target-x77gx" (UID: "41ba5b02-a248-4259-8ca2-8f501349c1b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:32.751078 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:32.750995 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerStarted","Data":"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591"} Apr 23 17:55:32.751284 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:32.751269 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:32.751338 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:32.751294 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:32.765718 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:32.765693 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:32.778663 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:32.778626 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" podStartSLOduration=6.463747338 podStartE2EDuration="19.778614229s" 
podCreationTimestamp="2026-04-23 17:55:13 +0000 UTC" firstStartedPulling="2026-04-23 17:55:13.927295869 +0000 UTC m=+184.162742549" lastFinishedPulling="2026-04-23 17:55:27.242162759 +0000 UTC m=+197.477609440" observedRunningTime="2026-04-23 17:55:32.778190865 +0000 UTC m=+203.013637573" watchObservedRunningTime="2026-04-23 17:55:32.778614229 +0000 UTC m=+203.014060931" Apr 23 17:55:33.389942 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:33.389907 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:33.390172 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:33.390023 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:33.753607 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:33.753533 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:33.769513 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:33.769483 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:34.026936 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:34.026862 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-q7mhh"] Apr 23 17:55:34.027068 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:34.026959 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:34.027110 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:34.027064 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:34.030568 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:34.030532 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-x77gx"] Apr 23 17:55:34.030706 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:34.030649 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:34.030805 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:34.030776 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3" Apr 23 17:55:34.031881 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:34.031841 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jfhpv"] Apr 23 17:55:34.032028 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:34.031976 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:34.032138 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:34.032104 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:35.361242 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:35.361208 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:35.389511 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:35.389466 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:35.389621 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:35.389466 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:35.389699 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:35.389680 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3" Apr 23 17:55:35.389734 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:35.389709 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:35.659309 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:35.659275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs\") pod \"network-metrics-daemon-jfhpv\" (UID: \"5baefb5e-77f1-440a-918c-82da4620b8d7\") " pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:35.659489 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:35.659381 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:35.659489 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:35.659432 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs podName:5baefb5e-77f1-440a-918c-82da4620b8d7 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:07.659418069 +0000 UTC m=+237.894864750 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs") pod "network-metrics-daemon-jfhpv" (UID: "5baefb5e-77f1-440a-918c-82da4620b8d7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:36.389973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:36.389879 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:36.390395 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:36.390010 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:37.389920 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:37.389887 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:37.390080 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:37.389894 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:37.390080 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:37.390000 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3" Apr 23 17:55:37.390377 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:37.390094 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:38.390129 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.390097 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:38.390482 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:38.390218 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:38.476167 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.476135 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gxfwt"] Apr 23 17:55:38.476598 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.476575 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="ovn-controller" containerID="cri-o://e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4" gracePeriod=30 Apr 23 17:55:38.476649 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.476623 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="nbdb" containerID="cri-o://32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf" gracePeriod=30 Apr 23 17:55:38.476713 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.476629 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289" gracePeriod=30 Apr 23 17:55:38.476713 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.476671 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="northd" containerID="cri-o://4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3" gracePeriod=30 Apr 23 17:55:38.476833 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.476720 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="ovn-acl-logging" containerID="cri-o://6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453" gracePeriod=30 Apr 23 17:55:38.476833 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.476710 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="sbdb" containerID="cri-o://4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae" gracePeriod=30 Apr 23 17:55:38.476833 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.476735 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="kube-rbac-proxy-node" containerID="cri-o://c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056" gracePeriod=30 Apr 23 17:55:38.493423 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.493264 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="ovnkube-controller" containerID="cri-o://61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591" gracePeriod=30 Apr 23 17:55:38.493423 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.493265 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="ovnkube-controller" probeResult="failure" output="" Apr 23 17:55:38.705894 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.705872 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxfwt_9530314a-cfd7-4042-95d5-610ca46c5b81/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 17:55:38.706248 ip-10-0-132-102 
kubenswrapper[2576]: I0423 17:55:38.706234 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxfwt_9530314a-cfd7-4042-95d5-610ca46c5b81/kube-rbac-proxy-node/0.log" Apr 23 17:55:38.706534 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.706520 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxfwt_9530314a-cfd7-4042-95d5-610ca46c5b81/ovn-acl-logging/0.log" Apr 23 17:55:38.706915 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.706899 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxfwt_9530314a-cfd7-4042-95d5-610ca46c5b81/ovn-controller/0.log" Apr 23 17:55:38.707038 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.707025 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:38.753696 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753655 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tph9h"] Apr 23 17:55:38.753911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753852 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="kube-rbac-proxy-node" Apr 23 17:55:38.753911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753868 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="kube-rbac-proxy-node" Apr 23 17:55:38.753911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753880 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="sbdb" Apr 23 17:55:38.753911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753887 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="sbdb" 
Apr 23 17:55:38.753911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753895 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="ovnkube-controller" Apr 23 17:55:38.753911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753902 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="ovnkube-controller" Apr 23 17:55:38.753911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753913 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="ovn-controller" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753923 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="ovn-controller" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753933 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="kube-rbac-proxy-ovn-metrics" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753942 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="kube-rbac-proxy-ovn-metrics" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753951 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="northd" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753958 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="northd" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753972 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="ovn-acl-logging" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753979 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="ovn-acl-logging" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753988 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="nbdb" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.753995 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="nbdb" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.754034 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="ovnkube-controller" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.754046 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="ovn-controller" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.754055 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="ovn-acl-logging" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.754064 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="kube-rbac-proxy-ovn-metrics" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.754073 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="nbdb" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.754082 2576 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="sbdb" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.754090 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="kube-rbac-proxy-node" Apr 23 17:55:38.754252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.754100 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerName="northd" Apr 23 17:55:38.757710 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.757690 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.770031 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.770009 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxfwt_9530314a-cfd7-4042-95d5-610ca46c5b81/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 17:55:38.770528 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.770514 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxfwt_9530314a-cfd7-4042-95d5-610ca46c5b81/kube-rbac-proxy-node/0.log" Apr 23 17:55:38.770929 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.770913 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxfwt_9530314a-cfd7-4042-95d5-610ca46c5b81/ovn-acl-logging/0.log" Apr 23 17:55:38.771319 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771306 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxfwt_9530314a-cfd7-4042-95d5-610ca46c5b81/ovn-controller/0.log" Apr 23 17:55:38.771378 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771341 2576 generic.go:358] "Generic (PLEG): container finished" podID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerID="61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591" 
exitCode=0 Apr 23 17:55:38.771378 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771357 2576 generic.go:358] "Generic (PLEG): container finished" podID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerID="4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae" exitCode=0 Apr 23 17:55:38.771378 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771366 2576 generic.go:358] "Generic (PLEG): container finished" podID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerID="32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf" exitCode=0 Apr 23 17:55:38.771378 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771374 2576 generic.go:358] "Generic (PLEG): container finished" podID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerID="4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3" exitCode=0 Apr 23 17:55:38.771378 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771380 2576 generic.go:358] "Generic (PLEG): container finished" podID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerID="8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289" exitCode=143 Apr 23 17:55:38.771563 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771385 2576 generic.go:358] "Generic (PLEG): container finished" podID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerID="c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056" exitCode=143 Apr 23 17:55:38.771563 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771390 2576 generic.go:358] "Generic (PLEG): container finished" podID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerID="6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453" exitCode=143 Apr 23 17:55:38.771563 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771395 2576 generic.go:358] "Generic (PLEG): container finished" podID="9530314a-cfd7-4042-95d5-610ca46c5b81" containerID="e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4" exitCode=143 Apr 23 17:55:38.771563 ip-10-0-132-102 
kubenswrapper[2576]: I0423 17:55:38.771422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerDied","Data":"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591"} Apr 23 17:55:38.771563 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771458 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerDied","Data":"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae"} Apr 23 17:55:38.771563 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerDied","Data":"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf"} Apr 23 17:55:38.771563 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771478 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerDied","Data":"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3"} Apr 23 17:55:38.771563 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771483 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" Apr 23 17:55:38.771563 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771499 2576 scope.go:117] "RemoveContainer" containerID="61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591" Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771488 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerDied","Data":"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771596 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerDied","Data":"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771611 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerDied","Data":"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771621 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771645 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771650 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771655 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771659 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771665 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerDied","Data":"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771675 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771684 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771690 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771694 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771698 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771702 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771706 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771710 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771718 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxfwt" event={"ID":"9530314a-cfd7-4042-95d5-610ca46c5b81","Type":"ContainerDied","Data":"be52c23a569e5a707de9910403025ace5fb32a50c3bfc3fd2e6dec259be7f0a4"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771725 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771730 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771734 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771751 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771757 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771764 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771771 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453"} Apr 23 17:55:38.771869 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.771777 2576 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4"} Apr 23 17:55:38.777358 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777335 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-systemd\") pod 
\"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " Apr 23 17:55:38.777433 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777399 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-ovn\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " Apr 23 17:55:38.777433 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777425 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-etc-openvswitch\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " Apr 23 17:55:38.777545 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777445 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-openvswitch\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " Apr 23 17:55:38.777545 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777463 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-node-log\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " Apr 23 17:55:38.777545 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777487 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-run-ovn-kubernetes\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " Apr 23 17:55:38.777545 ip-10-0-132-102 
kubenswrapper[2576]: I0423 17:55:38.777493 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:38.777545 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777501 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:38.777545 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777513 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-ovnkube-script-lib\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") "
Apr 23 17:55:38.777545 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777491 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "run-ovn".
PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:38.777545 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777533 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-ovnkube-config\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") "
Apr 23 17:55:38.777545 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777529 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-node-log" (OuterVolumeSpecName: "node-log") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:38.777911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777532 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "host-run-ovn-kubernetes".
PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:38.777911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777554 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-slash\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") "
Apr 23 17:55:38.777911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777582 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-slash" (OuterVolumeSpecName: "host-slash") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:38.777911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777606 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9530314a-cfd7-4042-95d5-610ca46c5b81-ovn-node-metrics-cert\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") "
Apr 23 17:55:38.777911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777632 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-kubelet\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") "
Apr 23 17:55:38.777911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777696 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81").
InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:38.777911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777781 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-var-lib-openvswitch\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.777911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777811 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-ovnkube-config\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.777911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777838 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-env-overrides\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.777911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777856 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "ovnkube-script-lib".
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:55:38.777911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777866 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-slash\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.777911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777893 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-run-openvswitch\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.777911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777901 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "ovnkube-config".
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:55:38.778446 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777938 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-ovn-node-metrics-cert\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.778446 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.777964 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmkmj\" (UniqueName: \"kubernetes.io/projected/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-kube-api-access-vmkmj\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.778446 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778041 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-run-systemd\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.778446 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778085 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-run-netns\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.778446 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778113 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName:
\"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-kubelet\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.778446 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-node-log\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.778446 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778162 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-log-socket\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.778446 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778182 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-cni-netd\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.778446 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778203 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-ovnkube-script-lib\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.778446 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778227 2576 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-systemd-units\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.778446 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778244 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-etc-openvswitch\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.778446 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778266 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.778446 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778313 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-cni-bin\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.778446 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-run-ovn\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") "
pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.778446 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778430 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-run-ovn-kubernetes\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:55:38.779026 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778479 2576 reconciler_common.go:299] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-ovn\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:55:38.779026 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778496 2576 reconciler_common.go:299] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-etc-openvswitch\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:55:38.779026 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778512 2576 reconciler_common.go:299] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-openvswitch\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:55:38.779026 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778526 2576 reconciler_common.go:299] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-node-log\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:55:38.779026 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778541 2576 reconciler_common.go:299] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-run-ovn-kubernetes\") on node
\"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:55:38.779026 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778555 2576 reconciler_common.go:299] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-ovnkube-script-lib\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:55:38.779026 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778569 2576 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-ovnkube-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:55:38.779026 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778584 2576 reconciler_common.go:299] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-slash\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:55:38.779026 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.778597 2576 reconciler_common.go:299] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-kubelet\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:55:38.780092 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.780075 2576 scope.go:117] "RemoveContainer" containerID="4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae"
Apr 23 17:55:38.781559 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.781531 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9530314a-cfd7-4042-95d5-610ca46c5b81-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "ovn-node-metrics-cert".
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:55:38.782402 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.782384 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:38.786128 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.786112 2576 scope.go:117] "RemoveContainer" containerID="32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf"
Apr 23 17:55:38.791444 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.791427 2576 scope.go:117] "RemoveContainer" containerID="4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3"
Apr 23 17:55:38.796606 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.796589 2576 scope.go:117] "RemoveContainer" containerID="8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289"
Apr 23 17:55:38.801960 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.801939 2576 scope.go:117] "RemoveContainer" containerID="c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056"
Apr 23 17:55:38.807085 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.807063 2576 scope.go:117] "RemoveContainer" containerID="6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453"
Apr 23 17:55:38.813188 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.813172 2576 scope.go:117] "RemoveContainer" containerID="e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4"
Apr 23 17:55:38.818966 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.818949 2576 scope.go:117] "RemoveContainer" containerID="61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591"
Apr 23 17:55:38.819214 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:38.819196 2576 log.go:32]
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591\": container with ID starting with 61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591 not found: ID does not exist" containerID="61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591"
Apr 23 17:55:38.819257 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.819223 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591"} err="failed to get container status \"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591\": rpc error: code = NotFound desc = could not find container \"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591\": container with ID starting with 61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591 not found: ID does not exist"
Apr 23 17:55:38.819257 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.819240 2576 scope.go:117] "RemoveContainer" containerID="4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae"
Apr 23 17:55:38.819435 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:38.819421 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae\": container with ID starting with 4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae not found: ID does not exist" containerID="4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae"
Apr 23 17:55:38.819474 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.819438 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae"} err="failed to get container status
\"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae\": rpc error: code = NotFound desc = could not find container \"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae\": container with ID starting with 4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae not found: ID does not exist"
Apr 23 17:55:38.819474 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.819450 2576 scope.go:117] "RemoveContainer" containerID="32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf"
Apr 23 17:55:38.819647 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:38.819630 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf\": container with ID starting with 32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf not found: ID does not exist" containerID="32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf"
Apr 23 17:55:38.819690 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.819651 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf"} err="failed to get container status \"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf\": rpc error: code = NotFound desc = could not find container \"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf\": container with ID starting with 32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf not found: ID does not exist"
Apr 23 17:55:38.819690 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.819662 2576 scope.go:117] "RemoveContainer" containerID="4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3"
Apr 23 17:55:38.819923 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:38.819906 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc
= could not find container \"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3\": container with ID starting with 4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3 not found: ID does not exist" containerID="4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3"
Apr 23 17:55:38.820015 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.819927 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3"} err="failed to get container status \"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3\": rpc error: code = NotFound desc = could not find container \"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3\": container with ID starting with 4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3 not found: ID does not exist"
Apr 23 17:55:38.820015 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.819939 2576 scope.go:117] "RemoveContainer" containerID="8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289"
Apr 23 17:55:38.820130 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:38.820118 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289\": container with ID starting with 8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289 not found: ID does not exist" containerID="8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289"
Apr 23 17:55:38.820168 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.820133 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289"} err="failed to get container status \"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289\": rpc error: code = NotFound desc = could not
find container \"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289\": container with ID starting with 8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289 not found: ID does not exist"
Apr 23 17:55:38.820168 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.820144 2576 scope.go:117] "RemoveContainer" containerID="c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056"
Apr 23 17:55:38.820374 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:38.820356 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056\": container with ID starting with c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056 not found: ID does not exist" containerID="c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056"
Apr 23 17:55:38.820407 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.820382 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056"} err="failed to get container status \"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056\": rpc error: code = NotFound desc = could not find container \"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056\": container with ID starting with c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056 not found: ID does not exist"
Apr 23 17:55:38.820407 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.820397 2576 scope.go:117] "RemoveContainer" containerID="6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453"
Apr 23 17:55:38.820620 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:38.820604 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453\": container with ID
starting with 6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453 not found: ID does not exist" containerID="6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453"
Apr 23 17:55:38.820665 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.820625 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453"} err="failed to get container status \"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453\": rpc error: code = NotFound desc = could not find container \"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453\": container with ID starting with 6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453 not found: ID does not exist"
Apr 23 17:55:38.820665 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.820642 2576 scope.go:117] "RemoveContainer" containerID="e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4"
Apr 23 17:55:38.820882 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:38.820864 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4\": container with ID starting with e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4 not found: ID does not exist" containerID="e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4"
Apr 23 17:55:38.820935 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.820885 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4"} err="failed to get container status \"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4\": rpc error: code = NotFound desc = could not find container \"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4\": container with ID starting with
e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4 not found: ID does not exist"
Apr 23 17:55:38.820935 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.820898 2576 scope.go:117] "RemoveContainer" containerID="61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591"
Apr 23 17:55:38.821114 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.821096 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591"} err="failed to get container status \"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591\": rpc error: code = NotFound desc = could not find container \"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591\": container with ID starting with 61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591 not found: ID does not exist"
Apr 23 17:55:38.821150 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.821116 2576 scope.go:117] "RemoveContainer" containerID="4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae"
Apr 23 17:55:38.821301 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.821281 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae"} err="failed to get container status \"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae\": rpc error: code = NotFound desc = could not find container \"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae\": container with ID starting with 4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae not found: ID does not exist"
Apr 23 17:55:38.821345 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.821302 2576 scope.go:117] "RemoveContainer" containerID="32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf"
Apr 23 17:55:38.821485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.821468 2576
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf"} err="failed to get container status \"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf\": rpc error: code = NotFound desc = could not find container \"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf\": container with ID starting with 32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf not found: ID does not exist"
Apr 23 17:55:38.821524 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.821485 2576 scope.go:117] "RemoveContainer" containerID="4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3"
Apr 23 17:55:38.821687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.821670 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3"} err="failed to get container status \"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3\": rpc error: code = NotFound desc = could not find container \"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3\": container with ID starting with 4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3 not found: ID does not exist"
Apr 23 17:55:38.821733 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.821688 2576 scope.go:117] "RemoveContainer" containerID="8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289"
Apr 23 17:55:38.821879 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.821863 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289"} err="failed to get container status \"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289\": rpc error: code = NotFound desc = could not find container
\"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289\": container with ID starting with 8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289 not found: ID does not exist" Apr 23 17:55:38.821927 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.821880 2576 scope.go:117] "RemoveContainer" containerID="c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056" Apr 23 17:55:38.822057 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.822034 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056"} err="failed to get container status \"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056\": rpc error: code = NotFound desc = could not find container \"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056\": container with ID starting with c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056 not found: ID does not exist" Apr 23 17:55:38.822100 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.822057 2576 scope.go:117] "RemoveContainer" containerID="6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453" Apr 23 17:55:38.822229 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.822209 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453"} err="failed to get container status \"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453\": rpc error: code = NotFound desc = could not find container \"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453\": container with ID starting with 6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453 not found: ID does not exist" Apr 23 17:55:38.822264 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.822231 2576 scope.go:117] "RemoveContainer" 
containerID="e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4" Apr 23 17:55:38.822392 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.822375 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4"} err="failed to get container status \"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4\": rpc error: code = NotFound desc = could not find container \"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4\": container with ID starting with e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4 not found: ID does not exist" Apr 23 17:55:38.822429 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.822392 2576 scope.go:117] "RemoveContainer" containerID="61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591" Apr 23 17:55:38.822556 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.822542 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591"} err="failed to get container status \"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591\": rpc error: code = NotFound desc = could not find container \"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591\": container with ID starting with 61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591 not found: ID does not exist" Apr 23 17:55:38.822597 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.822557 2576 scope.go:117] "RemoveContainer" containerID="4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae" Apr 23 17:55:38.822717 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.822702 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae"} err="failed to get container status 
\"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae\": rpc error: code = NotFound desc = could not find container \"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae\": container with ID starting with 4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae not found: ID does not exist" Apr 23 17:55:38.822775 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.822717 2576 scope.go:117] "RemoveContainer" containerID="32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf" Apr 23 17:55:38.822888 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.822869 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf"} err="failed to get container status \"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf\": rpc error: code = NotFound desc = could not find container \"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf\": container with ID starting with 32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf not found: ID does not exist" Apr 23 17:55:38.822947 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.822888 2576 scope.go:117] "RemoveContainer" containerID="4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3" Apr 23 17:55:38.823115 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.823087 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3"} err="failed to get container status \"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3\": rpc error: code = NotFound desc = could not find container \"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3\": container with ID starting with 4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3 not found: ID does not exist" Apr 23 17:55:38.823115 ip-10-0-132-102 
kubenswrapper[2576]: I0423 17:55:38.823113 2576 scope.go:117] "RemoveContainer" containerID="8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289" Apr 23 17:55:38.823314 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.823297 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289"} err="failed to get container status \"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289\": rpc error: code = NotFound desc = could not find container \"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289\": container with ID starting with 8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289 not found: ID does not exist" Apr 23 17:55:38.823362 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.823314 2576 scope.go:117] "RemoveContainer" containerID="c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056" Apr 23 17:55:38.823529 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.823514 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056"} err="failed to get container status \"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056\": rpc error: code = NotFound desc = could not find container \"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056\": container with ID starting with c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056 not found: ID does not exist" Apr 23 17:55:38.823578 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.823529 2576 scope.go:117] "RemoveContainer" containerID="6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453" Apr 23 17:55:38.823702 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.823689 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453"} err="failed to get container status \"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453\": rpc error: code = NotFound desc = could not find container \"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453\": container with ID starting with 6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453 not found: ID does not exist" Apr 23 17:55:38.823761 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.823711 2576 scope.go:117] "RemoveContainer" containerID="e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4" Apr 23 17:55:38.823892 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.823877 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4"} err="failed to get container status \"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4\": rpc error: code = NotFound desc = could not find container \"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4\": container with ID starting with e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4 not found: ID does not exist" Apr 23 17:55:38.823953 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.823891 2576 scope.go:117] "RemoveContainer" containerID="61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591" Apr 23 17:55:38.824070 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.824051 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591"} err="failed to get container status \"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591\": rpc error: code = NotFound desc = could not find container \"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591\": container with ID starting with 
61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591 not found: ID does not exist" Apr 23 17:55:38.824112 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.824072 2576 scope.go:117] "RemoveContainer" containerID="4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae" Apr 23 17:55:38.824254 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.824240 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae"} err="failed to get container status \"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae\": rpc error: code = NotFound desc = could not find container \"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae\": container with ID starting with 4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae not found: ID does not exist" Apr 23 17:55:38.824289 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.824256 2576 scope.go:117] "RemoveContainer" containerID="32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf" Apr 23 17:55:38.824433 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.824413 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf"} err="failed to get container status \"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf\": rpc error: code = NotFound desc = could not find container \"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf\": container with ID starting with 32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf not found: ID does not exist" Apr 23 17:55:38.824471 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.824435 2576 scope.go:117] "RemoveContainer" containerID="4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3" Apr 23 17:55:38.824609 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.824594 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3"} err="failed to get container status \"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3\": rpc error: code = NotFound desc = could not find container \"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3\": container with ID starting with 4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3 not found: ID does not exist" Apr 23 17:55:38.824653 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.824609 2576 scope.go:117] "RemoveContainer" containerID="8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289" Apr 23 17:55:38.824780 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.824764 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289"} err="failed to get container status \"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289\": rpc error: code = NotFound desc = could not find container \"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289\": container with ID starting with 8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289 not found: ID does not exist" Apr 23 17:55:38.824818 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.824780 2576 scope.go:117] "RemoveContainer" containerID="c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056" Apr 23 17:55:38.824959 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.824945 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056"} err="failed to get container status \"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056\": rpc error: code = NotFound desc = could not find container 
\"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056\": container with ID starting with c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056 not found: ID does not exist" Apr 23 17:55:38.825008 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.824959 2576 scope.go:117] "RemoveContainer" containerID="6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453" Apr 23 17:55:38.825132 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.825114 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453"} err="failed to get container status \"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453\": rpc error: code = NotFound desc = could not find container \"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453\": container with ID starting with 6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453 not found: ID does not exist" Apr 23 17:55:38.825132 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.825133 2576 scope.go:117] "RemoveContainer" containerID="e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4" Apr 23 17:55:38.825310 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.825288 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4"} err="failed to get container status \"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4\": rpc error: code = NotFound desc = could not find container \"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4\": container with ID starting with e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4 not found: ID does not exist" Apr 23 17:55:38.825310 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.825303 2576 scope.go:117] "RemoveContainer" 
containerID="61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591" Apr 23 17:55:38.825474 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.825460 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591"} err="failed to get container status \"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591\": rpc error: code = NotFound desc = could not find container \"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591\": container with ID starting with 61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591 not found: ID does not exist" Apr 23 17:55:38.825474 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.825475 2576 scope.go:117] "RemoveContainer" containerID="4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae" Apr 23 17:55:38.825637 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.825623 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae"} err="failed to get container status \"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae\": rpc error: code = NotFound desc = could not find container \"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae\": container with ID starting with 4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae not found: ID does not exist" Apr 23 17:55:38.825637 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.825637 2576 scope.go:117] "RemoveContainer" containerID="32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf" Apr 23 17:55:38.825872 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.825853 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf"} err="failed to get container status 
\"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf\": rpc error: code = NotFound desc = could not find container \"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf\": container with ID starting with 32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf not found: ID does not exist" Apr 23 17:55:38.825946 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.825874 2576 scope.go:117] "RemoveContainer" containerID="4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3" Apr 23 17:55:38.826077 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.826060 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3"} err="failed to get container status \"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3\": rpc error: code = NotFound desc = could not find container \"4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3\": container with ID starting with 4e6a0d77a4a5c856c4477f47f18f2f1d657660ea545b4921d1edc09b5e517ba3 not found: ID does not exist" Apr 23 17:55:38.826122 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.826076 2576 scope.go:117] "RemoveContainer" containerID="8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289" Apr 23 17:55:38.826252 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.826238 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289"} err="failed to get container status \"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289\": rpc error: code = NotFound desc = could not find container \"8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289\": container with ID starting with 8acaf4de6fb7b0bfa83020ce0c25c5c083a2d24c880946708043c2ba0edad289 not found: ID does not exist" Apr 23 17:55:38.826252 ip-10-0-132-102 
kubenswrapper[2576]: I0423 17:55:38.826250 2576 scope.go:117] "RemoveContainer" containerID="c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056" Apr 23 17:55:38.826441 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.826424 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056"} err="failed to get container status \"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056\": rpc error: code = NotFound desc = could not find container \"c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056\": container with ID starting with c792690db43a2a08709d9c8f8321e4e9da1a0ffa388d882896a51aed4db95056 not found: ID does not exist" Apr 23 17:55:38.826487 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.826442 2576 scope.go:117] "RemoveContainer" containerID="6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453" Apr 23 17:55:38.826639 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.826625 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453"} err="failed to get container status \"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453\": rpc error: code = NotFound desc = could not find container \"6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453\": container with ID starting with 6dac19800ef82baf2a23f69ac2369e46afc7ae380656a76f97e3be4ba7541453 not found: ID does not exist" Apr 23 17:55:38.826689 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.826638 2576 scope.go:117] "RemoveContainer" containerID="e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4" Apr 23 17:55:38.826831 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.826812 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4"} err="failed to get container status \"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4\": rpc error: code = NotFound desc = could not find container \"e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4\": container with ID starting with e9a68c630203be3b7e2bc2c925b70f00e9961baeb42da731c3d11c386d250ec4 not found: ID does not exist" Apr 23 17:55:38.826881 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.826831 2576 scope.go:117] "RemoveContainer" containerID="61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591" Apr 23 17:55:38.827020 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.827002 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591"} err="failed to get container status \"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591\": rpc error: code = NotFound desc = could not find container \"61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591\": container with ID starting with 61be7f1d04e6b31a2825caaa31786c9fa63dd701680349c43dee18144e951591 not found: ID does not exist" Apr 23 17:55:38.827069 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.827020 2576 scope.go:117] "RemoveContainer" containerID="4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae" Apr 23 17:55:38.827202 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.827187 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae"} err="failed to get container status \"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae\": rpc error: code = NotFound desc = could not find container \"4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae\": container with ID starting with 
4faf3e09e60be1415ee59b9b429d1e48c47b74ca76d517696f5fed40a92deaae not found: ID does not exist" Apr 23 17:55:38.827247 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.827202 2576 scope.go:117] "RemoveContainer" containerID="32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf" Apr 23 17:55:38.827375 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.827361 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf"} err="failed to get container status \"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf\": rpc error: code = NotFound desc = could not find container \"32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf\": container with ID starting with 32c7012e2b9d81c99141ccf87cba0e6be7824d9f5ec0c0dcf537684e887aecdf not found: ID does not exist" Apr 23 17:55:38.879730 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.879696 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-log-socket\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " Apr 23 17:55:38.879889 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.879759 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-run-netns\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " Apr 23 17:55:38.879889 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.879786 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-systemd-units\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: 
\"9530314a-cfd7-4042-95d5-610ca46c5b81\") " Apr 23 17:55:38.879889 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.879790 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-log-socket" (OuterVolumeSpecName: "log-socket") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Apr 23 17:55:38.879889 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.879800 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-var-lib-openvswitch\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " Apr 23 17:55:38.879889 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.879822 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Apr 23 17:55:38.879889 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.879848 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Apr 23 17:55:38.879889 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.879850 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Apr 23 17:55:38.879889 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.879863 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-env-overrides\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " Apr 23 17:55:38.879889 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.879891 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46q8k\" (UniqueName: \"kubernetes.io/projected/9530314a-cfd7-4042-95d5-610ca46c5b81-kube-api-access-46q8k\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.879908 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-cni-bin\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.879928 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.879946 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.879960 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-cni-netd\") pod \"9530314a-cfd7-4042-95d5-610ca46c5b81\" (UID: \"9530314a-cfd7-4042-95d5-610ca46c5b81\") " Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.879968 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880038 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-run-systemd\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880054 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-run-netns\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-kubelet\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880124 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-node-log\") pod 
\"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880134 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-run-systemd\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-log-socket\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880163 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880177 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-run-netns\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-cni-netd\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880202 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-kubelet\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880287 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-node-log\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-ovnkube-script-lib\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880228 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-cni-netd\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880226 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-log-socket\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880252 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-systemd-units\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880278 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-etc-openvswitch\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tph9h\" (UID: 
\"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-cni-bin\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880349 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-systemd-units\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880350 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-etc-openvswitch\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880361 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-run-ovn\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880385 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-cni-bin\") pod \"ovnkube-node-tph9h\" (UID: 
\"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880393 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-run-ovn\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-run-ovn-kubernetes\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880442 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-var-lib-openvswitch\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880447 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-run-ovn-kubernetes\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.880973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880470 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-ovnkube-config\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880481 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-var-lib-openvswitch\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880495 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-env-overrides\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-slash\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-run-openvswitch\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880577 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-ovn-node-metrics-cert\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880605 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-host-slash\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmkmj\" (UniqueName: \"kubernetes.io/projected/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-kube-api-access-vmkmj\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880640 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-run-openvswitch\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880646 2576 reconciler_common.go:299] "Volume detached for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-log-socket\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880662 2576 reconciler_common.go:299] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-run-systemd\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880677 2576 reconciler_common.go:299] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9530314a-cfd7-4042-95d5-610ca46c5b81-ovn-node-metrics-cert\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880692 2576 reconciler_common.go:299] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-run-netns\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880708 2576 reconciler_common.go:299] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-systemd-units\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880720 2576 reconciler_common.go:299] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-var-lib-openvswitch\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880731 2576 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9530314a-cfd7-4042-95d5-610ca46c5b81-env-overrides\") on node 
\"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880766 2576 reconciler_common.go:299] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-cni-bin\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880779 2576 reconciler_common.go:299] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-var-lib-cni-networks-ovn-kubernetes\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880792 2576 reconciler_common.go:299] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9530314a-cfd7-4042-95d5-610ca46c5b81-host-cni-netd\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-ovnkube-script-lib\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.881485 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.880917 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-env-overrides\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.882135 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.881172 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-ovnkube-config\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.882135 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.882072 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9530314a-cfd7-4042-95d5-610ca46c5b81-kube-api-access-46q8k" (OuterVolumeSpecName: "kube-api-access-46q8k") pod "9530314a-cfd7-4042-95d5-610ca46c5b81" (UID: "9530314a-cfd7-4042-95d5-610ca46c5b81"). InnerVolumeSpecName "kube-api-access-46q8k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:55:38.882824 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.882808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-ovn-node-metrics-cert\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.894733 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.894712 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmkmj\" (UniqueName: \"kubernetes.io/projected/518ae3f8-909f-4ac9-932b-cf6c27fde0e0-kube-api-access-vmkmj\") pod \"ovnkube-node-tph9h\" (UID: \"518ae3f8-909f-4ac9-932b-cf6c27fde0e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:38.981966 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:38.981891 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-46q8k\" (UniqueName: \"kubernetes.io/projected/9530314a-cfd7-4042-95d5-610ca46c5b81-kube-api-access-46q8k\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:55:39.070047 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:39.070005 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:39.113411 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:39.113384 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gxfwt"] Apr 23 17:55:39.118732 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:39.118708 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gxfwt"] Apr 23 17:55:39.389973 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:39.389946 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:39.390087 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:39.389946 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:39.390087 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:39.390044 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3" Apr 23 17:55:39.390153 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:39.390103 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:39.774547 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:39.774476 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6g56n_ae56a92f-dfae-4763-b849-dca72bc2cf3d/kube-multus/0.log" Apr 23 17:55:39.774547 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:39.774511 2576 generic.go:358] "Generic (PLEG): container finished" podID="ae56a92f-dfae-4763-b849-dca72bc2cf3d" containerID="aca5adc6a19b0a329c7471b6ff92d6713f7c1042f723106b7edf813e54eb35d6" exitCode=2 Apr 23 17:55:39.774729 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:39.774577 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6g56n" event={"ID":"ae56a92f-dfae-4763-b849-dca72bc2cf3d","Type":"ContainerDied","Data":"aca5adc6a19b0a329c7471b6ff92d6713f7c1042f723106b7edf813e54eb35d6"} Apr 23 17:55:39.775002 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:39.774986 2576 scope.go:117] "RemoveContainer" containerID="aca5adc6a19b0a329c7471b6ff92d6713f7c1042f723106b7edf813e54eb35d6" Apr 23 17:55:39.777757 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:39.777719 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" event={"ID":"518ae3f8-909f-4ac9-932b-cf6c27fde0e0","Type":"ContainerStarted","Data":"dbc0d830c6c6c8d0f3ec532613c46a462e78738d5fce70e3254e8db4110daa47"} Apr 23 17:55:39.777875 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:39.777765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" event={"ID":"518ae3f8-909f-4ac9-932b-cf6c27fde0e0","Type":"ContainerStarted","Data":"cd6484008380fd11449d366e4bd163755e8ec6f517561b4796e957ea8c139f66"} Apr 23 17:55:39.777875 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:39.777780 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" 
event={"ID":"518ae3f8-909f-4ac9-932b-cf6c27fde0e0","Type":"ContainerStarted","Data":"b15ac76e2c6ada4140eed504c4ad472e11945e2724faf4b9d8e4d7273692e78f"} Apr 23 17:55:39.777875 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:39.777792 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" event={"ID":"518ae3f8-909f-4ac9-932b-cf6c27fde0e0","Type":"ContainerStarted","Data":"31ce471497fe285b8b2fd2b0f567eed872229c9cfcc997ec05b2cc8a72a02169"} Apr 23 17:55:39.777875 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:39.777803 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" event={"ID":"518ae3f8-909f-4ac9-932b-cf6c27fde0e0","Type":"ContainerStarted","Data":"fadc1339c1c583a84578b8844b450b73271e2f34d7fb4d624cee78cd392bb534"} Apr 23 17:55:39.777875 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:39.777814 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" event={"ID":"518ae3f8-909f-4ac9-932b-cf6c27fde0e0","Type":"ContainerStarted","Data":"554114734cdb7fa749f771fb2b0848176a99e323210dbcda6a7554fcc404ed09"} Apr 23 17:55:39.777875 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:39.777826 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" event={"ID":"518ae3f8-909f-4ac9-932b-cf6c27fde0e0","Type":"ContainerStarted","Data":"82629e92e7c738fafb177bc31788aa63f72f5b81aa8cdf2fbf20d0e86180eb9d"} Apr 23 17:55:40.362341 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:40.362301 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:40.390496 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:40.390461 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:40.391075 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:40.390548 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:40.392541 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:40.392515 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9530314a-cfd7-4042-95d5-610ca46c5b81" path="/var/lib/kubelet/pods/9530314a-cfd7-4042-95d5-610ca46c5b81/volumes" Apr 23 17:55:40.780687 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:40.780661 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6g56n_ae56a92f-dfae-4763-b849-dca72bc2cf3d/kube-multus/0.log" Apr 23 17:55:40.780841 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:40.780726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6g56n" event={"ID":"ae56a92f-dfae-4763-b849-dca72bc2cf3d","Type":"ContainerStarted","Data":"8f7b23264c5219c35ba3f42e72e834e217555efae9ee2af7332531f004b7eb2c"} Apr 23 17:55:41.389708 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:41.389674 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:41.389890 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:41.389681 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:41.389890 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:41.389819 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3" Apr 23 17:55:41.389890 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:41.389854 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:41.785104 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:41.785017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" event={"ID":"518ae3f8-909f-4ac9-932b-cf6c27fde0e0","Type":"ContainerStarted","Data":"955c7d2c780f5ab5cc8c19c3fd046d9f62619564798b1a3a3edd7b284c899e6c"} Apr 23 17:55:42.389891 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:42.389702 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:42.390042 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:42.389973 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:43.389898 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:43.389861 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:43.390328 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:43.389861 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:43.390328 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:43.389989 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3" Apr 23 17:55:43.390328 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:43.390038 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:44.389583 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:44.389544 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:44.389734 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:44.389657 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:44.796929 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:44.796848 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" event={"ID":"518ae3f8-909f-4ac9-932b-cf6c27fde0e0","Type":"ContainerStarted","Data":"f320d4575d8b6eaa0028dc7ea02d10e93c4de72dab904095d0ade01aeb0e0e27"} Apr 23 17:55:44.797305 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:44.797130 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:44.797305 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:44.797158 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:44.811613 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:44.811584 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:44.826383 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:44.826344 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" podStartSLOduration=6.826333445 podStartE2EDuration="6.826333445s" podCreationTimestamp="2026-04-23 17:55:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:55:44.825910229 
+0000 UTC m=+215.061356932" watchObservedRunningTime="2026-04-23 17:55:44.826333445 +0000 UTC m=+215.061780147" Apr 23 17:55:45.363084 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:45.363049 2576 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:45.389347 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:45.389316 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:45.389494 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:45.389316 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:45.389494 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:45.389440 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3" Apr 23 17:55:45.389590 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:45.389538 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:45.799032 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:45.798996 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:45.812526 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:45.812504 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h" Apr 23 17:55:46.390012 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:46.389981 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:46.390146 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:46.390090 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:47.389579 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:47.389546 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:47.389579 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:47.389566 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:47.390015 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:47.389799 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3" Apr 23 17:55:47.390015 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:47.389870 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:48.389599 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:48.389519 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:48.389954 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:48.389777 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q7mhh" podUID="c10ccf97-5e76-4972-b775-25d5b2e5a887" Apr 23 17:55:48.542429 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:48.542395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stggz\" (UniqueName: \"kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz\") pod \"network-check-target-x77gx\" (UID: \"41ba5b02-a248-4259-8ca2-8f501349c1b3\") " pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:48.542567 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:48.542545 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:55:48.542567 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:48.542564 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:55:48.542627 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:48.542574 2576 projected.go:194] Error preparing data for projected volume kube-api-access-stggz for pod openshift-network-diagnostics/network-check-target-x77gx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:48.542627 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:48.542623 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz podName:41ba5b02-a248-4259-8ca2-8f501349c1b3 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:20.542609196 +0000 UTC m=+250.778055880 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-stggz" (UniqueName: "kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz") pod "network-check-target-x77gx" (UID: "41ba5b02-a248-4259-8ca2-8f501349c1b3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:49.390211 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:49.390172 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:49.390588 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:49.390172 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:49.390588 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:49.390286 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x77gx" podUID="41ba5b02-a248-4259-8ca2-8f501349c1b3" Apr 23 17:55:49.390588 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:49.390387 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jfhpv" podUID="5baefb5e-77f1-440a-918c-82da4620b8d7" Apr 23 17:55:49.664482 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:49.664452 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-n7pdd"] Apr 23 17:55:49.698562 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:49.698535 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-n7pdd" Apr 23 17:55:49.702480 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:49.702456 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 17:55:49.702602 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:49.702460 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 17:55:49.702864 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:49.702851 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qcv89\"" Apr 23 17:55:49.749382 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:49.749348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrfkd\" (UniqueName: \"kubernetes.io/projected/3c4a21a3-0078-4632-bce8-ee31a63bceb2-kube-api-access-vrfkd\") pod \"node-resolver-n7pdd\" (UID: \"3c4a21a3-0078-4632-bce8-ee31a63bceb2\") " pod="openshift-dns/node-resolver-n7pdd" Apr 23 17:55:49.749552 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:49.749410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3c4a21a3-0078-4632-bce8-ee31a63bceb2-hosts-file\") pod \"node-resolver-n7pdd\" (UID: \"3c4a21a3-0078-4632-bce8-ee31a63bceb2\") " pod="openshift-dns/node-resolver-n7pdd" Apr 23 17:55:49.749552 ip-10-0-132-102 kubenswrapper[2576]: I0423 
17:55:49.749446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3c4a21a3-0078-4632-bce8-ee31a63bceb2-tmp-dir\") pod \"node-resolver-n7pdd\" (UID: \"3c4a21a3-0078-4632-bce8-ee31a63bceb2\") " pod="openshift-dns/node-resolver-n7pdd" Apr 23 17:55:49.850638 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:49.850600 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3c4a21a3-0078-4632-bce8-ee31a63bceb2-hosts-file\") pod \"node-resolver-n7pdd\" (UID: \"3c4a21a3-0078-4632-bce8-ee31a63bceb2\") " pod="openshift-dns/node-resolver-n7pdd" Apr 23 17:55:49.850837 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:49.850660 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3c4a21a3-0078-4632-bce8-ee31a63bceb2-tmp-dir\") pod \"node-resolver-n7pdd\" (UID: \"3c4a21a3-0078-4632-bce8-ee31a63bceb2\") " pod="openshift-dns/node-resolver-n7pdd" Apr 23 17:55:49.850837 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:49.850679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrfkd\" (UniqueName: \"kubernetes.io/projected/3c4a21a3-0078-4632-bce8-ee31a63bceb2-kube-api-access-vrfkd\") pod \"node-resolver-n7pdd\" (UID: \"3c4a21a3-0078-4632-bce8-ee31a63bceb2\") " pod="openshift-dns/node-resolver-n7pdd" Apr 23 17:55:49.850837 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:49.850734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3c4a21a3-0078-4632-bce8-ee31a63bceb2-hosts-file\") pod \"node-resolver-n7pdd\" (UID: \"3c4a21a3-0078-4632-bce8-ee31a63bceb2\") " pod="openshift-dns/node-resolver-n7pdd" Apr 23 17:55:49.863979 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:49.863957 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vrfkd\" (UniqueName: \"kubernetes.io/projected/3c4a21a3-0078-4632-bce8-ee31a63bceb2-kube-api-access-vrfkd\") pod \"node-resolver-n7pdd\" (UID: \"3c4a21a3-0078-4632-bce8-ee31a63bceb2\") " pod="openshift-dns/node-resolver-n7pdd" Apr 23 17:55:49.864149 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:49.864132 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3c4a21a3-0078-4632-bce8-ee31a63bceb2-tmp-dir\") pod \"node-resolver-n7pdd\" (UID: \"3c4a21a3-0078-4632-bce8-ee31a63bceb2\") " pod="openshift-dns/node-resolver-n7pdd" Apr 23 17:55:50.007232 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:50.007147 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-n7pdd" Apr 23 17:55:50.014575 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:55:50.014541 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c4a21a3_0078_4632_bce8_ee31a63bceb2.slice/crio-9a40160347babf8764373288bfac2b5aa8762cf45eebb42b83ac3d62d1e6d5f6 WatchSource:0}: Error finding container 9a40160347babf8764373288bfac2b5aa8762cf45eebb42b83ac3d62d1e6d5f6: Status 404 returned error can't find the container with id 9a40160347babf8764373288bfac2b5aa8762cf45eebb42b83ac3d62d1e6d5f6 Apr 23 17:55:50.390944 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:50.390912 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh" Apr 23 17:55:50.393299 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:50.393277 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 17:55:50.809260 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:50.809168 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-n7pdd" event={"ID":"3c4a21a3-0078-4632-bce8-ee31a63bceb2","Type":"ContainerStarted","Data":"f830d98a70ae4bef04e41fbee8b6da62f6ab02613f4b8e872c72291ce9b4cefe"} Apr 23 17:55:50.809260 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:50.809208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-n7pdd" event={"ID":"3c4a21a3-0078-4632-bce8-ee31a63bceb2","Type":"ContainerStarted","Data":"9a40160347babf8764373288bfac2b5aa8762cf45eebb42b83ac3d62d1e6d5f6"} Apr 23 17:55:51.390184 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:51.390148 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv" Apr 23 17:55:51.390359 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:51.390145 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:55:51.392324 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:51.392300 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 17:55:51.392904 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:51.392882 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-brp64\"" Apr 23 17:55:51.393024 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:51.392907 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-s85dw\"" Apr 23 17:55:51.393024 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:51.392920 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 17:55:51.393024 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:51.392883 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 17:55:54.617903 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.617873 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-102.ec2.internal" event="NodeReady" Apr 23 17:55:54.668310 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.668247 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-n7pdd" podStartSLOduration=5.668232858 podStartE2EDuration="5.668232858s" podCreationTimestamp="2026-04-23 17:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:55:50.824048447 +0000 UTC m=+221.059495149" watchObservedRunningTime="2026-04-23 17:55:54.668232858 +0000 UTC m=+224.903679561" Apr 23 17:55:54.668588 
ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.668575 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fmtwc"] Apr 23 17:55:54.678577 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.678554 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fmtwc"] Apr 23 17:55:54.678703 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.678650 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fmtwc" Apr 23 17:55:54.681710 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.681686 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lcwv7"] Apr 23 17:55:54.682787 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.682711 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 17:55:54.682787 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.682722 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 17:55:54.682949 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.682792 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 17:55:54.682949 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.682868 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wh8vz\"" Apr 23 17:55:54.693148 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.693128 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lcwv7"] Apr 23 17:55:54.693242 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.693231 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lcwv7" Apr 23 17:55:54.695174 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.695156 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4gw5f\"" Apr 23 17:55:54.695284 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.695156 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 17:55:54.695284 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.695242 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 17:55:54.782270 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.782239 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert\") pod \"ingress-canary-fmtwc\" (UID: \"bfd74eb8-918a-45f2-abb0-8342a3e4ebc4\") " pod="openshift-ingress-canary/ingress-canary-fmtwc" Apr 23 17:55:54.782270 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.782282 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5n8z\" (UniqueName: \"kubernetes.io/projected/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-kube-api-access-x5n8z\") pod \"ingress-canary-fmtwc\" (UID: \"bfd74eb8-918a-45f2-abb0-8342a3e4ebc4\") " pod="openshift-ingress-canary/ingress-canary-fmtwc" Apr 23 17:55:54.782481 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.782303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-tmp-dir\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7" Apr 23 17:55:54.782481 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.782356 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b5v7\" (UniqueName: \"kubernetes.io/projected/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-kube-api-access-6b5v7\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7" Apr 23 17:55:54.782481 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.782401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7" Apr 23 17:55:54.782481 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.782419 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-config-volume\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7" Apr 23 17:55:54.883765 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.883649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5n8z\" (UniqueName: \"kubernetes.io/projected/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-kube-api-access-x5n8z\") pod \"ingress-canary-fmtwc\" (UID: \"bfd74eb8-918a-45f2-abb0-8342a3e4ebc4\") " pod="openshift-ingress-canary/ingress-canary-fmtwc" Apr 23 17:55:54.883765 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.883692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-tmp-dir\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7" Apr 23 17:55:54.883765 ip-10-0-132-102 kubenswrapper[2576]: 
I0423 17:55:54.883718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6b5v7\" (UniqueName: \"kubernetes.io/projected/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-kube-api-access-6b5v7\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7" Apr 23 17:55:54.883985 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.883848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7" Apr 23 17:55:54.883985 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.883879 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-config-volume\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7" Apr 23 17:55:54.883985 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:54.883961 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:55:54.884107 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.883961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert\") pod \"ingress-canary-fmtwc\" (UID: \"bfd74eb8-918a-45f2-abb0-8342a3e4ebc4\") " pod="openshift-ingress-canary/ingress-canary-fmtwc" Apr 23 17:55:54.884107 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:54.884016 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:55:54.884107 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:54.884017 2576 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls podName:4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:55.383999712 +0000 UTC m=+225.619446399 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls") pod "dns-default-lcwv7" (UID: "4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9") : secret "dns-default-metrics-tls" not found Apr 23 17:55:54.884107 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.884075 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-tmp-dir\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7" Apr 23 17:55:54.884107 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:54.884093 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert podName:bfd74eb8-918a-45f2-abb0-8342a3e4ebc4 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:55.384079387 +0000 UTC m=+225.619526082 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert") pod "ingress-canary-fmtwc" (UID: "bfd74eb8-918a-45f2-abb0-8342a3e4ebc4") : secret "canary-serving-cert" not found Apr 23 17:55:54.884430 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.884415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-config-volume\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7" Apr 23 17:55:54.892363 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.892339 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b5v7\" (UniqueName: \"kubernetes.io/projected/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-kube-api-access-6b5v7\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7" Apr 23 17:55:54.892602 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:54.892587 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5n8z\" (UniqueName: \"kubernetes.io/projected/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-kube-api-access-x5n8z\") pod \"ingress-canary-fmtwc\" (UID: \"bfd74eb8-918a-45f2-abb0-8342a3e4ebc4\") " pod="openshift-ingress-canary/ingress-canary-fmtwc" Apr 23 17:55:55.387621 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:55.387591 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7" Apr 23 17:55:55.387857 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:55.387664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert\") pod \"ingress-canary-fmtwc\" (UID: \"bfd74eb8-918a-45f2-abb0-8342a3e4ebc4\") " pod="openshift-ingress-canary/ingress-canary-fmtwc" Apr 23 17:55:55.387857 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:55.387756 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:55:55.387857 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:55.387765 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:55:55.387857 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:55.387816 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls podName:4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:56.387801496 +0000 UTC m=+226.623248181 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls") pod "dns-default-lcwv7" (UID: "4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9") : secret "dns-default-metrics-tls" not found Apr 23 17:55:55.387857 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:55.387830 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert podName:bfd74eb8-918a-45f2-abb0-8342a3e4ebc4 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:56.387824638 +0000 UTC m=+226.623271323 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert") pod "ingress-canary-fmtwc" (UID: "bfd74eb8-918a-45f2-abb0-8342a3e4ebc4") : secret "canary-serving-cert" not found
Apr 23 17:55:56.394159 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:56.394126 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7"
Apr 23 17:55:56.394537 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:56.394187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert\") pod \"ingress-canary-fmtwc\" (UID: \"bfd74eb8-918a-45f2-abb0-8342a3e4ebc4\") " pod="openshift-ingress-canary/ingress-canary-fmtwc"
Apr 23 17:55:56.394537 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:56.394263 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:55:56.394537 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:56.394270 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:55:56.394537 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:56.394315 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert podName:bfd74eb8-918a-45f2-abb0-8342a3e4ebc4 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:58.394302254 +0000 UTC m=+228.629748936 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert") pod "ingress-canary-fmtwc" (UID: "bfd74eb8-918a-45f2-abb0-8342a3e4ebc4") : secret "canary-serving-cert" not found
Apr 23 17:55:56.394537 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:56.394327 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls podName:4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:58.394321354 +0000 UTC m=+228.629768034 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls") pod "dns-default-lcwv7" (UID: "4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9") : secret "dns-default-metrics-tls" not found
Apr 23 17:55:58.074042 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:58.074001 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-855ll"]
Apr 23 17:55:58.077457 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:58.077436 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-855ll"
Apr 23 17:55:58.079982 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:58.079958 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 23 17:55:58.080105 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:58.080016 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 23 17:55:58.080105 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:58.080063 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-q5c8s\""
Apr 23 17:55:58.081671 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:58.081653 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-855ll"]
Apr 23 17:55:58.105654 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:58.105621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp5gs\" (UniqueName: \"kubernetes.io/projected/bcdca23a-69b4-4008-b9cd-3d1b6622c920-kube-api-access-vp5gs\") pod \"migrator-74bb7799d9-855ll\" (UID: \"bcdca23a-69b4-4008-b9cd-3d1b6622c920\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-855ll"
Apr 23 17:55:58.206412 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:58.206378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vp5gs\" (UniqueName: \"kubernetes.io/projected/bcdca23a-69b4-4008-b9cd-3d1b6622c920-kube-api-access-vp5gs\") pod \"migrator-74bb7799d9-855ll\" (UID: \"bcdca23a-69b4-4008-b9cd-3d1b6622c920\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-855ll"
Apr 23 17:55:58.214842 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:58.214815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp5gs\" (UniqueName: \"kubernetes.io/projected/bcdca23a-69b4-4008-b9cd-3d1b6622c920-kube-api-access-vp5gs\") pod \"migrator-74bb7799d9-855ll\" (UID: \"bcdca23a-69b4-4008-b9cd-3d1b6622c920\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-855ll"
Apr 23 17:55:58.387103 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:58.387060 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-855ll"
Apr 23 17:55:58.407622 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:58.407592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7"
Apr 23 17:55:58.407764 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:58.407656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert\") pod \"ingress-canary-fmtwc\" (UID: \"bfd74eb8-918a-45f2-abb0-8342a3e4ebc4\") " pod="openshift-ingress-canary/ingress-canary-fmtwc"
Apr 23 17:55:58.407810 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:58.407757 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:55:58.407844 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:58.407818 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls podName:4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:02.407801669 +0000 UTC m=+232.643248349 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls") pod "dns-default-lcwv7" (UID: "4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9") : secret "dns-default-metrics-tls" not found
Apr 23 17:55:58.407887 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:58.407758 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:55:58.407920 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:58.407910 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert podName:bfd74eb8-918a-45f2-abb0-8342a3e4ebc4 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:02.407897486 +0000 UTC m=+232.643344166 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert") pod "ingress-canary-fmtwc" (UID: "bfd74eb8-918a-45f2-abb0-8342a3e4ebc4") : secret "canary-serving-cert" not found
Apr 23 17:55:58.505896 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:58.505865 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-855ll"]
Apr 23 17:55:58.509221 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:55:58.509200 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcdca23a_69b4_4008_b9cd_3d1b6622c920.slice/crio-4d3a35ce6dd8db2a8c332cc3f2eebb7b983e3314591042d367333fd7fd6755c1 WatchSource:0}: Error finding container 4d3a35ce6dd8db2a8c332cc3f2eebb7b983e3314591042d367333fd7fd6755c1: Status 404 returned error can't find the container with id 4d3a35ce6dd8db2a8c332cc3f2eebb7b983e3314591042d367333fd7fd6755c1
Apr 23 17:55:58.825562 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:58.825479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-855ll" event={"ID":"bcdca23a-69b4-4008-b9cd-3d1b6622c920","Type":"ContainerStarted","Data":"4d3a35ce6dd8db2a8c332cc3f2eebb7b983e3314591042d367333fd7fd6755c1"}
Apr 23 17:55:59.532404 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.532363 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-wd76j"]
Apr 23 17:55:59.535355 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.535340 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:55:59.537510 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.537489 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 23 17:55:59.538130 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.537951 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 17:55:59.538348 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.538280 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5jzrb\""
Apr 23 17:55:59.538348 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.538293 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 23 17:55:59.538483 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.538384 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 17:55:59.546541 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.546514 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wd76j"]
Apr 23 17:55:59.617167 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.617137 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/940e0919-0fc3-4b70-81f5-5a818c8ded8c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:55:59.617328 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.617178 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/940e0919-0fc3-4b70-81f5-5a818c8ded8c-data-volume\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:55:59.617328 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.617202 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs9l6\" (UniqueName: \"kubernetes.io/projected/940e0919-0fc3-4b70-81f5-5a818c8ded8c-kube-api-access-rs9l6\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:55:59.617328 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.617278 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/940e0919-0fc3-4b70-81f5-5a818c8ded8c-crio-socket\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:55:59.617436 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.617368 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:55:59.718322 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.718289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:55:59.718416 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.718336 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/940e0919-0fc3-4b70-81f5-5a818c8ded8c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:55:59.718416 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.718358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/940e0919-0fc3-4b70-81f5-5a818c8ded8c-data-volume\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:55:59.718416 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.718373 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rs9l6\" (UniqueName: \"kubernetes.io/projected/940e0919-0fc3-4b70-81f5-5a818c8ded8c-kube-api-access-rs9l6\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:55:59.718524 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:59.718446 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 23 17:55:59.718524 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.718496 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/940e0919-0fc3-4b70-81f5-5a818c8ded8c-crio-socket\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:55:59.718524 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:55:59.718514 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls podName:940e0919-0fc3-4b70-81f5-5a818c8ded8c nodeName:}" failed. No retries permitted until 2026-04-23 17:56:00.218493899 +0000 UTC m=+230.453940581 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wd76j" (UID: "940e0919-0fc3-4b70-81f5-5a818c8ded8c") : secret "insights-runtime-extractor-tls" not found
Apr 23 17:55:59.718693 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.718663 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/940e0919-0fc3-4b70-81f5-5a818c8ded8c-crio-socket\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:55:59.718773 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.718734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/940e0919-0fc3-4b70-81f5-5a818c8ded8c-data-volume\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:55:59.718913 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.718898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/940e0919-0fc3-4b70-81f5-5a818c8ded8c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:55:59.727884 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.727863 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs9l6\" (UniqueName: \"kubernetes.io/projected/940e0919-0fc3-4b70-81f5-5a818c8ded8c-kube-api-access-rs9l6\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:55:59.799000 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.796509 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-n7pdd_3c4a21a3-0078-4632-bce8-ee31a63bceb2/dns-node-resolver/0.log"
Apr 23 17:55:59.829256 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.829220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-855ll" event={"ID":"bcdca23a-69b4-4008-b9cd-3d1b6622c920","Type":"ContainerStarted","Data":"bff191359091d9f4164da73ac7fde5cd79f2c0eb3ca5ac727cb3905fc85fe58d"}
Apr 23 17:55:59.829388 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.829263 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-855ll" event={"ID":"bcdca23a-69b4-4008-b9cd-3d1b6622c920","Type":"ContainerStarted","Data":"08b0247ec563c058127853ee14d1c74563908540e44952c6dff099b1f8deca3f"}
Apr 23 17:55:59.844425 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:55:59.844383 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-855ll" podStartSLOduration=1.049365823 podStartE2EDuration="1.844369223s" podCreationTimestamp="2026-04-23 17:55:58 +0000 UTC" firstStartedPulling="2026-04-23 17:55:58.511496493 +0000 UTC m=+228.746943185" lastFinishedPulling="2026-04-23 17:55:59.306499905 +0000 UTC m=+229.541946585" observedRunningTime="2026-04-23 17:55:59.843593911 +0000 UTC m=+230.079040615" watchObservedRunningTime="2026-04-23 17:55:59.844369223 +0000 UTC m=+230.079815948"
Apr 23 17:56:00.221834 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:00.221788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:56:00.222017 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:56:00.221941 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 23 17:56:00.222017 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:56:00.222004 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls podName:940e0919-0fc3-4b70-81f5-5a818c8ded8c nodeName:}" failed. No retries permitted until 2026-04-23 17:56:01.221985529 +0000 UTC m=+231.457432221 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wd76j" (UID: "940e0919-0fc3-4b70-81f5-5a818c8ded8c") : secret "insights-runtime-extractor-tls" not found
Apr 23 17:56:00.599003 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:00.598927 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9pnhp_4c608978-9ca3-4730-81a8-ed012e4601c4/node-ca/0.log"
Apr 23 17:56:01.231505 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:01.231472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:56:01.231680 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:56:01.231581 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 23 17:56:01.231680 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:56:01.231631 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls podName:940e0919-0fc3-4b70-81f5-5a818c8ded8c nodeName:}" failed. No retries permitted until 2026-04-23 17:56:03.231618415 +0000 UTC m=+233.467065096 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wd76j" (UID: "940e0919-0fc3-4b70-81f5-5a818c8ded8c") : secret "insights-runtime-extractor-tls" not found
Apr 23 17:56:02.441250 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:02.441198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert\") pod \"ingress-canary-fmtwc\" (UID: \"bfd74eb8-918a-45f2-abb0-8342a3e4ebc4\") " pod="openshift-ingress-canary/ingress-canary-fmtwc"
Apr 23 17:56:02.441656 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:02.441282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7"
Apr 23 17:56:02.441656 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:56:02.441341 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:56:02.441656 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:56:02.441380 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:56:02.441656 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:56:02.441428 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert podName:bfd74eb8-918a-45f2-abb0-8342a3e4ebc4 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:10.441412071 +0000 UTC m=+240.676858752 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert") pod "ingress-canary-fmtwc" (UID: "bfd74eb8-918a-45f2-abb0-8342a3e4ebc4") : secret "canary-serving-cert" not found
Apr 23 17:56:02.441656 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:56:02.441442 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls podName:4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:10.441436598 +0000 UTC m=+240.676883279 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls") pod "dns-default-lcwv7" (UID: "4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9") : secret "dns-default-metrics-tls" not found
Apr 23 17:56:03.247775 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:03.247724 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:56:03.247963 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:56:03.247866 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 23 17:56:03.247963 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:56:03.247929 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls podName:940e0919-0fc3-4b70-81f5-5a818c8ded8c nodeName:}" failed. No retries permitted until 2026-04-23 17:56:07.247913473 +0000 UTC m=+237.483360154 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wd76j" (UID: "940e0919-0fc3-4b70-81f5-5a818c8ded8c") : secret "insights-runtime-extractor-tls" not found
Apr 23 17:56:07.278698 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:07.278647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:56:07.279109 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:56:07.278802 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 23 17:56:07.279109 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:56:07.278866 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls podName:940e0919-0fc3-4b70-81f5-5a818c8ded8c nodeName:}" failed. No retries permitted until 2026-04-23 17:56:15.278850955 +0000 UTC m=+245.514297636 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wd76j" (UID: "940e0919-0fc3-4b70-81f5-5a818c8ded8c") : secret "insights-runtime-extractor-tls" not found
Apr 23 17:56:07.682498 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:07.682450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs\") pod \"network-metrics-daemon-jfhpv\" (UID: \"5baefb5e-77f1-440a-918c-82da4620b8d7\") " pod="openshift-multus/network-metrics-daemon-jfhpv"
Apr 23 17:56:07.682653 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:56:07.682606 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 17:56:07.682714 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:56:07.682670 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs podName:5baefb5e-77f1-440a-918c-82da4620b8d7 nodeName:}" failed. No retries permitted until 2026-04-23 17:57:11.682655604 +0000 UTC m=+301.918102290 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs") pod "network-metrics-daemon-jfhpv" (UID: "5baefb5e-77f1-440a-918c-82da4620b8d7") : secret "metrics-daemon-secret" not found
Apr 23 17:56:10.503599 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:10.503546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7"
Apr 23 17:56:10.504008 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:10.503651 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert\") pod \"ingress-canary-fmtwc\" (UID: \"bfd74eb8-918a-45f2-abb0-8342a3e4ebc4\") " pod="openshift-ingress-canary/ingress-canary-fmtwc"
Apr 23 17:56:10.506047 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:10.506027 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9-metrics-tls\") pod \"dns-default-lcwv7\" (UID: \"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9\") " pod="openshift-dns/dns-default-lcwv7"
Apr 23 17:56:10.506107 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:10.506072 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfd74eb8-918a-45f2-abb0-8342a3e4ebc4-cert\") pod \"ingress-canary-fmtwc\" (UID: \"bfd74eb8-918a-45f2-abb0-8342a3e4ebc4\") " pod="openshift-ingress-canary/ingress-canary-fmtwc"
Apr 23 17:56:10.589756 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:10.589660 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wh8vz\""
Apr 23 17:56:10.598317 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:10.598289 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fmtwc"
Apr 23 17:56:10.604127 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:10.604104 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4gw5f\""
Apr 23 17:56:10.612806 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:10.612767 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lcwv7"
Apr 23 17:56:10.725171 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:10.725142 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fmtwc"]
Apr 23 17:56:10.727948 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:56:10.727914 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfd74eb8_918a_45f2_abb0_8342a3e4ebc4.slice/crio-5c70c5f2f3796be06d7d4177c7782522f07da0b8e599342a4710c10ab2ab38c6 WatchSource:0}: Error finding container 5c70c5f2f3796be06d7d4177c7782522f07da0b8e599342a4710c10ab2ab38c6: Status 404 returned error can't find the container with id 5c70c5f2f3796be06d7d4177c7782522f07da0b8e599342a4710c10ab2ab38c6
Apr 23 17:56:10.739991 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:10.739962 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lcwv7"]
Apr 23 17:56:10.743195 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:56:10.743171 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ec00fc8_34ea_4af6_892d_8c8dafb8d9a9.slice/crio-c73bdd0ba11591ce3637e015cfa3a5ca7db336d594f97e4290cb31b73af972d1 WatchSource:0}: Error finding container c73bdd0ba11591ce3637e015cfa3a5ca7db336d594f97e4290cb31b73af972d1: Status 404 returned error can't find the container with id c73bdd0ba11591ce3637e015cfa3a5ca7db336d594f97e4290cb31b73af972d1
Apr 23 17:56:10.849055 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:10.848957 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lcwv7" event={"ID":"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9","Type":"ContainerStarted","Data":"c73bdd0ba11591ce3637e015cfa3a5ca7db336d594f97e4290cb31b73af972d1"}
Apr 23 17:56:10.849784 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:10.849762 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fmtwc" event={"ID":"bfd74eb8-918a-45f2-abb0-8342a3e4ebc4","Type":"ContainerStarted","Data":"5c70c5f2f3796be06d7d4177c7782522f07da0b8e599342a4710c10ab2ab38c6"}
Apr 23 17:56:12.855070 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:12.854977 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lcwv7" event={"ID":"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9","Type":"ContainerStarted","Data":"5f96d07a1d4e8824f7d3eb7387916eca4831cb62d41c51359ad3eafe71a15c08"}
Apr 23 17:56:12.855070 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:12.855017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lcwv7" event={"ID":"4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9","Type":"ContainerStarted","Data":"5d8164cf5b6e47d0058cdcd549c98c042e9b96eec32ced8f0b22ad118b88b719"}
Apr 23 17:56:12.855522 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:12.855088 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lcwv7"
Apr 23 17:56:12.856288 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:12.856267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fmtwc" event={"ID":"bfd74eb8-918a-45f2-abb0-8342a3e4ebc4","Type":"ContainerStarted","Data":"3d834494405b318b9a5deca2605d327cd5680daa0e1e26bf13ba1233af049eb5"}
Apr 23 17:56:12.869118 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:12.869078 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lcwv7" podStartSLOduration=17.128901609 podStartE2EDuration="18.869064401s" podCreationTimestamp="2026-04-23 17:55:54 +0000 UTC" firstStartedPulling="2026-04-23 17:56:10.74493585 +0000 UTC m=+240.980382531" lastFinishedPulling="2026-04-23 17:56:12.485098642 +0000 UTC m=+242.720545323" observedRunningTime="2026-04-23 17:56:12.868457498 +0000 UTC m=+243.103904212" watchObservedRunningTime="2026-04-23 17:56:12.869064401 +0000 UTC m=+243.104511104"
Apr 23 17:56:12.880522 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:12.880482 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fmtwc" podStartSLOduration=17.121715407 podStartE2EDuration="18.880469352s" podCreationTimestamp="2026-04-23 17:55:54 +0000 UTC" firstStartedPulling="2026-04-23 17:56:10.729841289 +0000 UTC m=+240.965287971" lastFinishedPulling="2026-04-23 17:56:12.488595034 +0000 UTC m=+242.724041916" observedRunningTime="2026-04-23 17:56:12.880311706 +0000 UTC m=+243.115758409" watchObservedRunningTime="2026-04-23 17:56:12.880469352 +0000 UTC m=+243.115916053"
Apr 23 17:56:15.342896 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:15.342835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:56:15.345138 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:15.345116 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/940e0919-0fc3-4b70-81f5-5a818c8ded8c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wd76j\" (UID: \"940e0919-0fc3-4b70-81f5-5a818c8ded8c\") " pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:56:15.445988 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:15.445956 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5jzrb\""
Apr 23 17:56:15.454815 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:15.454795 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wd76j"
Apr 23 17:56:15.586110 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:15.586075 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wd76j"]
Apr 23 17:56:15.589980 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:56:15.589944 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod940e0919_0fc3_4b70_81f5_5a818c8ded8c.slice/crio-7643c0d559eae0cd0f25c506e556a4d9d69882c7c21c22219ae40af6ac5d3c77 WatchSource:0}: Error finding container 7643c0d559eae0cd0f25c506e556a4d9d69882c7c21c22219ae40af6ac5d3c77: Status 404 returned error can't find the container with id 7643c0d559eae0cd0f25c506e556a4d9d69882c7c21c22219ae40af6ac5d3c77
Apr 23 17:56:15.864851 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:15.864820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wd76j" event={"ID":"940e0919-0fc3-4b70-81f5-5a818c8ded8c","Type":"ContainerStarted","Data":"75b53af552216f65cdf14f737a3b3094d83026725f438634fe445146c5ef0ac5"}
Apr 23 17:56:15.865009 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:15.864857 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wd76j" event={"ID":"940e0919-0fc3-4b70-81f5-5a818c8ded8c","Type":"ContainerStarted","Data":"7643c0d559eae0cd0f25c506e556a4d9d69882c7c21c22219ae40af6ac5d3c77"}
Apr 23 17:56:16.869542 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:16.869500 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wd76j" event={"ID":"940e0919-0fc3-4b70-81f5-5a818c8ded8c","Type":"ContainerStarted","Data":"d38edecf03e09a4f2032adcc6cfc0d7c3cd99a7d79be840c2227920540c152dc"}
Apr 23 17:56:17.812600 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:17.812572 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tph9h"
Apr 23 17:56:17.874960 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:17.874920 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wd76j" event={"ID":"940e0919-0fc3-4b70-81f5-5a818c8ded8c","Type":"ContainerStarted","Data":"ca4bdbc5195dee67d21ff0decc9c247b00f88faf6178e4a775752ebfd14ce30f"}
Apr 23 17:56:17.891457 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:17.891406 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wd76j" podStartSLOduration=16.919894275 podStartE2EDuration="18.891393604s" podCreationTimestamp="2026-04-23 17:55:59 +0000 UTC" firstStartedPulling="2026-04-23 17:56:15.643164245 +0000 UTC m=+245.878610926" lastFinishedPulling="2026-04-23 17:56:17.614663568 +0000 UTC m=+247.850110255" observedRunningTime="2026-04-23 17:56:17.890604479 +0000 UTC m=+248.126051188" watchObservedRunningTime="2026-04-23 17:56:17.891393604 +0000 UTC m=+248.126840307"
Apr 23 17:56:20.582703 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:20.582659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"kube-api-access-stggz\" (UniqueName: \"kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz\") pod \"network-check-target-x77gx\" (UID: \"41ba5b02-a248-4259-8ca2-8f501349c1b3\") " pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:56:20.585100 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:20.585083 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 17:56:20.595643 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:20.595622 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 17:56:20.606832 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:20.606800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stggz\" (UniqueName: \"kubernetes.io/projected/41ba5b02-a248-4259-8ca2-8f501349c1b3-kube-api-access-stggz\") pod \"network-check-target-x77gx\" (UID: \"41ba5b02-a248-4259-8ca2-8f501349c1b3\") " pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:56:20.806675 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:20.806644 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-brp64\"" Apr 23 17:56:20.814634 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:20.814611 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:56:20.925260 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:20.925230 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-x77gx"] Apr 23 17:56:20.927828 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:56:20.927801 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41ba5b02_a248_4259_8ca2_8f501349c1b3.slice/crio-7fe6e709d629b57fc5e6a53c6a1caacadcd3af0fca88361d9dfd99e89990d3b5 WatchSource:0}: Error finding container 7fe6e709d629b57fc5e6a53c6a1caacadcd3af0fca88361d9dfd99e89990d3b5: Status 404 returned error can't find the container with id 7fe6e709d629b57fc5e6a53c6a1caacadcd3af0fca88361d9dfd99e89990d3b5 Apr 23 17:56:21.888140 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:21.888106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x77gx" event={"ID":"41ba5b02-a248-4259-8ca2-8f501349c1b3","Type":"ContainerStarted","Data":"7fe6e709d629b57fc5e6a53c6a1caacadcd3af0fca88361d9dfd99e89990d3b5"} Apr 23 17:56:22.289453 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.289374 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7d6947dcbc-j7jjm"] Apr 23 17:56:22.292493 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.292470 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.294849 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.294810 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 17:56:22.295010 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.294908 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 17:56:22.295912 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.295878 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 17:56:22.295912 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.295907 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-f7p8x\"" Apr 23 17:56:22.302377 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.302324 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 17:56:22.303686 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.303665 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7d6947dcbc-j7jjm"] Apr 23 17:56:22.398411 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.398381 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a0316b20-9d56-4972-9cdf-d2acf03e0921-installation-pull-secrets\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.398411 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.398416 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a0316b20-9d56-4972-9cdf-d2acf03e0921-registry-certificates\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.398645 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.398476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbvnm\" (UniqueName: \"kubernetes.io/projected/a0316b20-9d56-4972-9cdf-d2acf03e0921-kube-api-access-xbvnm\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.398645 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.398530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a0316b20-9d56-4972-9cdf-d2acf03e0921-image-registry-private-configuration\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.398645 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.398549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a0316b20-9d56-4972-9cdf-d2acf03e0921-ca-trust-extracted\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.398645 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.398575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a0316b20-9d56-4972-9cdf-d2acf03e0921-bound-sa-token\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.398645 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.398601 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0316b20-9d56-4972-9cdf-d2acf03e0921-trusted-ca\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.398873 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.398659 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a0316b20-9d56-4972-9cdf-d2acf03e0921-registry-tls\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.499912 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.499875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a0316b20-9d56-4972-9cdf-d2acf03e0921-registry-tls\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.500081 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.499924 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a0316b20-9d56-4972-9cdf-d2acf03e0921-installation-pull-secrets\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 
17:56:22.500081 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.499955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a0316b20-9d56-4972-9cdf-d2acf03e0921-registry-certificates\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.500081 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.500011 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbvnm\" (UniqueName: \"kubernetes.io/projected/a0316b20-9d56-4972-9cdf-d2acf03e0921-kube-api-access-xbvnm\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.500081 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.500056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a0316b20-9d56-4972-9cdf-d2acf03e0921-image-registry-private-configuration\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.500296 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.500083 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a0316b20-9d56-4972-9cdf-d2acf03e0921-ca-trust-extracted\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.500296 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.500105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a0316b20-9d56-4972-9cdf-d2acf03e0921-bound-sa-token\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.500296 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.500135 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0316b20-9d56-4972-9cdf-d2acf03e0921-trusted-ca\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.500617 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.500590 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a0316b20-9d56-4972-9cdf-d2acf03e0921-ca-trust-extracted\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.501122 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.501095 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a0316b20-9d56-4972-9cdf-d2acf03e0921-registry-certificates\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.501885 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.501861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0316b20-9d56-4972-9cdf-d2acf03e0921-trusted-ca\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.503457 ip-10-0-132-102 kubenswrapper[2576]: I0423 
17:56:22.503433 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a0316b20-9d56-4972-9cdf-d2acf03e0921-image-registry-private-configuration\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.503559 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.503519 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a0316b20-9d56-4972-9cdf-d2acf03e0921-installation-pull-secrets\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.503559 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.503534 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a0316b20-9d56-4972-9cdf-d2acf03e0921-registry-tls\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.509201 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.509174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0316b20-9d56-4972-9cdf-d2acf03e0921-bound-sa-token\") pod \"image-registry-7d6947dcbc-j7jjm\" (UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.509427 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.509401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbvnm\" (UniqueName: \"kubernetes.io/projected/a0316b20-9d56-4972-9cdf-d2acf03e0921-kube-api-access-xbvnm\") pod \"image-registry-7d6947dcbc-j7jjm\" 
(UID: \"a0316b20-9d56-4972-9cdf-d2acf03e0921\") " pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.603876 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.603795 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:22.860822 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:22.860753 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lcwv7" Apr 23 17:56:23.605678 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:23.605648 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7d6947dcbc-j7jjm"] Apr 23 17:56:23.608945 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:56:23.608915 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0316b20_9d56_4972_9cdf_d2acf03e0921.slice/crio-bd2374ec237f3c15949fcb6fb47ba462b83e1885ae619aaa5c0c5f6f8f28316f WatchSource:0}: Error finding container bd2374ec237f3c15949fcb6fb47ba462b83e1885ae619aaa5c0c5f6f8f28316f: Status 404 returned error can't find the container with id bd2374ec237f3c15949fcb6fb47ba462b83e1885ae619aaa5c0c5f6f8f28316f Apr 23 17:56:23.894574 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:23.894537 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" event={"ID":"a0316b20-9d56-4972-9cdf-d2acf03e0921","Type":"ContainerStarted","Data":"54f1b5698dbe6afa09468f851d414b06cf9f0ad67339fa540200ab0cadb83dd2"} Apr 23 17:56:23.894574 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:23.894576 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" event={"ID":"a0316b20-9d56-4972-9cdf-d2acf03e0921","Type":"ContainerStarted","Data":"bd2374ec237f3c15949fcb6fb47ba462b83e1885ae619aaa5c0c5f6f8f28316f"} Apr 23 
17:56:23.894887 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:23.894629 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:23.898150 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:23.898125 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x77gx" event={"ID":"41ba5b02-a248-4259-8ca2-8f501349c1b3","Type":"ContainerStarted","Data":"2583101d9464314f17baca62946cfcd249e4ba78102c834d155646970c91d0f8"} Apr 23 17:56:23.898268 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:23.898246 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-x77gx" Apr 23 17:56:23.910492 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:23.910453 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" podStartSLOduration=1.91043812 podStartE2EDuration="1.91043812s" podCreationTimestamp="2026-04-23 17:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:56:23.909166973 +0000 UTC m=+254.144613676" watchObservedRunningTime="2026-04-23 17:56:23.91043812 +0000 UTC m=+254.145884823" Apr 23 17:56:23.920800 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:23.920757 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-x77gx" podStartSLOduration=65.301842278 podStartE2EDuration="1m7.920727593s" podCreationTimestamp="2026-04-23 17:55:16 +0000 UTC" firstStartedPulling="2026-04-23 17:56:20.929789582 +0000 UTC m=+251.165236266" lastFinishedPulling="2026-04-23 17:56:23.5486749 +0000 UTC m=+253.784121581" observedRunningTime="2026-04-23 17:56:23.920133445 +0000 UTC m=+254.155580150" 
watchObservedRunningTime="2026-04-23 17:56:23.920727593 +0000 UTC m=+254.156174289" Apr 23 17:56:32.060535 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.060500 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cpw42"] Apr 23 17:56:32.065911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.065891 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cpw42" Apr 23 17:56:32.069260 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.069232 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 17:56:32.069455 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.069375 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qddvv\"" Apr 23 17:56:32.069529 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.069237 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 17:56:32.069844 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.069826 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 17:56:32.069965 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.069941 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 17:56:32.070037 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.070023 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 17:56:32.070429 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.070405 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 17:56:32.175649 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.175619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-node-exporter-accelerators-collector-config\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42" Apr 23 17:56:32.175867 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.175670 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-node-exporter-textfile\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42" Apr 23 17:56:32.175867 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.175718 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-root\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42" Apr 23 17:56:32.175867 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.175764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmm6h\" (UniqueName: \"kubernetes.io/projected/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-kube-api-access-kmm6h\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42" Apr 23 17:56:32.175867 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.175826 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42" Apr 23 17:56:32.176057 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.175910 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-node-exporter-wtmp\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42" Apr 23 17:56:32.176057 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.175952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-metrics-client-ca\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42" Apr 23 17:56:32.176057 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.175983 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-sys\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42" Apr 23 17:56:32.176057 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.176007 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-node-exporter-tls\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42" Apr 23 17:56:32.277108 
ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.277075 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-root\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42" Apr 23 17:56:32.277260 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.277117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmm6h\" (UniqueName: \"kubernetes.io/projected/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-kube-api-access-kmm6h\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42" Apr 23 17:56:32.277260 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.277180 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-root\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42" Apr 23 17:56:32.277351 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.277264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42" Apr 23 17:56:32.277351 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.277304 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-node-exporter-wtmp\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42" 
Apr 23 17:56:32.277351 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.277330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-metrics-client-ca\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42"
Apr 23 17:56:32.277500 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.277379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-sys\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42"
Apr 23 17:56:32.277500 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.277424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-node-exporter-tls\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42"
Apr 23 17:56:32.277500 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.277437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-node-exporter-wtmp\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42"
Apr 23 17:56:32.277644 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.277500 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-sys\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42"
Apr 23 17:56:32.277644 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.277511 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-node-exporter-accelerators-collector-config\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42"
Apr 23 17:56:32.277644 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.277573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-node-exporter-textfile\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42"
Apr 23 17:56:32.277893 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.277864 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-node-exporter-textfile\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42"
Apr 23 17:56:32.277997 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.277981 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-metrics-client-ca\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42"
Apr 23 17:56:32.278117 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.278098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-node-exporter-accelerators-collector-config\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42"
Apr 23 17:56:32.279770 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.279727 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42"
Apr 23 17:56:32.279971 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.279953 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-node-exporter-tls\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42"
Apr 23 17:56:32.284770 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.284730 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmm6h\" (UniqueName: \"kubernetes.io/projected/ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9-kube-api-access-kmm6h\") pod \"node-exporter-cpw42\" (UID: \"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9\") " pod="openshift-monitoring/node-exporter-cpw42"
Apr 23 17:56:32.378606 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.378575 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cpw42"
Apr 23 17:56:32.387417 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:56:32.387391 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecf8cb7c_9030_427c_bcf5_3a814bf8d6d9.slice/crio-3938464b306862197f35bd0d5cd292234276b4fe192b3a1228f39daf644b36aa WatchSource:0}: Error finding container 3938464b306862197f35bd0d5cd292234276b4fe192b3a1228f39daf644b36aa: Status 404 returned error can't find the container with id 3938464b306862197f35bd0d5cd292234276b4fe192b3a1228f39daf644b36aa
Apr 23 17:56:32.920594 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:32.920550 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cpw42" event={"ID":"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9","Type":"ContainerStarted","Data":"3938464b306862197f35bd0d5cd292234276b4fe192b3a1228f39daf644b36aa"}
Apr 23 17:56:33.159928 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.159904 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 17:56:33.163181 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.163165 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.165164 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.165143 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 23 17:56:33.165324 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.165191 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 23 17:56:33.165324 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.165228 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 23 17:56:33.165566 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.165537 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-dv2fr\""
Apr 23 17:56:33.165620 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.165591 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 23 17:56:33.166247 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.166012 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 23 17:56:33.166247 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.166038 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 23 17:56:33.166247 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.166083 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 23 17:56:33.166247 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.166112 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 23 17:56:33.166247 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.166136 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 23 17:56:33.193717 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.193658 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 17:56:33.285579 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.285544 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-config-volume\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.285717 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.285582 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/392b13b6-7939-4aac-8409-2fcb938f87a3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.285717 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.285608 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.285717 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.285633 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jtb4\" (UniqueName: \"kubernetes.io/projected/392b13b6-7939-4aac-8409-2fcb938f87a3-kube-api-access-4jtb4\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.285717 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.285668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/392b13b6-7939-4aac-8409-2fcb938f87a3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.285717 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.285690 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.285717 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.285714 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-web-config\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.285949 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.285759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/392b13b6-7939-4aac-8409-2fcb938f87a3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.285949 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.285790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.285949 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.285810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/392b13b6-7939-4aac-8409-2fcb938f87a3-config-out\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.285949 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.285840 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/392b13b6-7939-4aac-8409-2fcb938f87a3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.285949 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.285885 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.285949 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.285901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.387101 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.387067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/392b13b6-7939-4aac-8409-2fcb938f87a3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.387278 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.387121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.387278 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.387141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.387278 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.387161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-config-volume\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.387278 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.387178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/392b13b6-7939-4aac-8409-2fcb938f87a3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.387278 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.387194 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.387278 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.387212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jtb4\" (UniqueName: \"kubernetes.io/projected/392b13b6-7939-4aac-8409-2fcb938f87a3-kube-api-access-4jtb4\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.387278 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.387242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/392b13b6-7939-4aac-8409-2fcb938f87a3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.387619 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.387274 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.387619 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.387318 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-web-config\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.387619 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.387348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/392b13b6-7939-4aac-8409-2fcb938f87a3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.387619 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.387379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.387619 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.387406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/392b13b6-7939-4aac-8409-2fcb938f87a3-config-out\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.387619 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.387498 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/392b13b6-7939-4aac-8409-2fcb938f87a3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.388568 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.388505 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/392b13b6-7939-4aac-8409-2fcb938f87a3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.388568 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.388540 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/392b13b6-7939-4aac-8409-2fcb938f87a3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.390043 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.390001 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/392b13b6-7939-4aac-8409-2fcb938f87a3-config-out\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.390386 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.390350 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.390680 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.390634 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/392b13b6-7939-4aac-8409-2fcb938f87a3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.390792 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.390675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.391098 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.391073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-config-volume\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.391185 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.391116 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.391185 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.391136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.391327 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.391308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-web-config\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.392181 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.392158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.395201 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.395181 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jtb4\" (UniqueName: \"kubernetes.io/projected/392b13b6-7939-4aac-8409-2fcb938f87a3-kube-api-access-4jtb4\") pod \"alertmanager-main-0\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.472654 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.472571 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:33.604398 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.604363 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 17:56:33.606948 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:56:33.606921 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod392b13b6_7939_4aac_8409_2fcb938f87a3.slice/crio-3c875c0491788610b9dd619df49115ea7f5763b9b6ddb37ccab21f50dae35bf3 WatchSource:0}: Error finding container 3c875c0491788610b9dd619df49115ea7f5763b9b6ddb37ccab21f50dae35bf3: Status 404 returned error can't find the container with id 3c875c0491788610b9dd619df49115ea7f5763b9b6ddb37ccab21f50dae35bf3
Apr 23 17:56:33.924773 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.924725 2576 generic.go:358] "Generic (PLEG): container finished" podID="ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9" containerID="2d762cdc56d20f6bfcc96d38010ac13987cf72672337c7f25cb014b91630653b" exitCode=0
Apr 23 17:56:33.924977 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.924813 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cpw42" event={"ID":"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9","Type":"ContainerDied","Data":"2d762cdc56d20f6bfcc96d38010ac13987cf72672337c7f25cb014b91630653b"}
Apr 23 17:56:33.925981 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:33.925952 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"392b13b6-7939-4aac-8409-2fcb938f87a3","Type":"ContainerStarted","Data":"3c875c0491788610b9dd619df49115ea7f5763b9b6ddb37ccab21f50dae35bf3"}
Apr 23 17:56:34.801464 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:34.801373 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:56:34.804111 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:34.804089 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 17:56:34.813794 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:34.813766 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c10ccf97-5e76-4972-b775-25d5b2e5a887-original-pull-secret\") pod \"global-pull-secret-syncer-q7mhh\" (UID: \"c10ccf97-5e76-4972-b775-25d5b2e5a887\") " pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:56:34.930932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:34.930889 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cpw42" event={"ID":"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9","Type":"ContainerStarted","Data":"66ed00514f762c372f020483a24cafb24bcd08a67289976147a76b5d0b9cecae"}
Apr 23 17:56:34.930932 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:34.930932 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cpw42" event={"ID":"ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9","Type":"ContainerStarted","Data":"093d31e047d8fb84569b9f42acacdf29f50690e803e35774a1d2f149268ad90f"}
Apr 23 17:56:34.932292 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:34.932258 2576 generic.go:358] "Generic (PLEG): container finished" podID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerID="cdd640ffab431cbc5debcf3f207ff1935cc302005f2435553b50548b5f8834d8" exitCode=0
Apr 23 17:56:34.932409 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:34.932321 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"392b13b6-7939-4aac-8409-2fcb938f87a3","Type":"ContainerDied","Data":"cdd640ffab431cbc5debcf3f207ff1935cc302005f2435553b50548b5f8834d8"}
Apr 23 17:56:34.950547 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:34.950503 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-cpw42" podStartSLOduration=2.253345984 podStartE2EDuration="2.950490972s" podCreationTimestamp="2026-04-23 17:56:32 +0000 UTC" firstStartedPulling="2026-04-23 17:56:32.389280943 +0000 UTC m=+262.624727624" lastFinishedPulling="2026-04-23 17:56:33.08642592 +0000 UTC m=+263.321872612" observedRunningTime="2026-04-23 17:56:34.949819308 +0000 UTC m=+265.185266011" watchObservedRunningTime="2026-04-23 17:56:34.950490972 +0000 UTC m=+265.185937675"
Apr 23 17:56:35.099138 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:35.099066 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q7mhh"
Apr 23 17:56:35.212201 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:35.212162 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-q7mhh"]
Apr 23 17:56:35.215016 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:56:35.214990 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc10ccf97_5e76_4972_b775_25d5b2e5a887.slice/crio-a0a016caf874cbb4642b3adb399f0ec1007cc59fcd65d23eb839070ab9e672da WatchSource:0}: Error finding container a0a016caf874cbb4642b3adb399f0ec1007cc59fcd65d23eb839070ab9e672da: Status 404 returned error can't find the container with id a0a016caf874cbb4642b3adb399f0ec1007cc59fcd65d23eb839070ab9e672da
Apr 23 17:56:35.382805 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:35.382774 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-8dtjc"]
Apr 23 17:56:35.385807 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:35.385788 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-8dtjc"
Apr 23 17:56:35.388516 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:35.388487 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-dhvk6\""
Apr 23 17:56:35.388642 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:35.388563 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 23 17:56:35.390691 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:35.390674 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 23 17:56:35.398857 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:35.398830 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-8dtjc"]
Apr 23 17:56:35.507880 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:35.507840 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4hrx\" (UniqueName: \"kubernetes.io/projected/2028d82d-64c8-4897-a6c2-1adb482b3e8d-kube-api-access-k4hrx\") pod \"downloads-6bcc868b7-8dtjc\" (UID: \"2028d82d-64c8-4897-a6c2-1adb482b3e8d\") " pod="openshift-console/downloads-6bcc868b7-8dtjc"
Apr 23 17:56:35.609549 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:35.609148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4hrx\" (UniqueName: \"kubernetes.io/projected/2028d82d-64c8-4897-a6c2-1adb482b3e8d-kube-api-access-k4hrx\") pod \"downloads-6bcc868b7-8dtjc\" (UID: \"2028d82d-64c8-4897-a6c2-1adb482b3e8d\") " pod="openshift-console/downloads-6bcc868b7-8dtjc"
Apr 23 17:56:35.623578 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:35.623542 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4hrx\" (UniqueName: \"kubernetes.io/projected/2028d82d-64c8-4897-a6c2-1adb482b3e8d-kube-api-access-k4hrx\") pod \"downloads-6bcc868b7-8dtjc\" (UID: \"2028d82d-64c8-4897-a6c2-1adb482b3e8d\") " pod="openshift-console/downloads-6bcc868b7-8dtjc"
Apr 23 17:56:35.697276 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:35.697191 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-8dtjc"
Apr 23 17:56:35.853058 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:35.853024 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-8dtjc"]
Apr 23 17:56:35.937031 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:35.936994 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-q7mhh" event={"ID":"c10ccf97-5e76-4972-b775-25d5b2e5a887","Type":"ContainerStarted","Data":"a0a016caf874cbb4642b3adb399f0ec1007cc59fcd65d23eb839070ab9e672da"}
Apr 23 17:56:36.221384 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:56:36.221352 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2028d82d_64c8_4897_a6c2_1adb482b3e8d.slice/crio-2e38ff8ae1e99beba2d5caad7bb005793d37183ca5b5bef8581a176b730c3d88 WatchSource:0}: Error finding container 2e38ff8ae1e99beba2d5caad7bb005793d37183ca5b5bef8581a176b730c3d88: Status 404 returned error can't find the container with id 2e38ff8ae1e99beba2d5caad7bb005793d37183ca5b5bef8581a176b730c3d88
Apr 23 17:56:36.378160 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.378124 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-594fb98f6c-rmldp"]
Apr 23 17:56:36.382406 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.382205 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp"
Apr 23 17:56:36.385027 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.384950 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 17:56:36.385027 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.384986 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 23 17:56:36.385027 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.384993 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 23 17:56:36.385522 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.385477 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-h9s8x\""
Apr 23 17:56:36.385625 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.385545 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 23 17:56:36.385625 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.385591 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-biuupntb26q1f\""
Apr 23 17:56:36.394442 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.394409 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-594fb98f6c-rmldp"]
Apr 23 17:56:36.516365 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.516339 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a750ac45-1a8d-4704-9fcb-4701164f2bd7-secret-metrics-server-client-certs\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp"
Apr 23 17:56:36.516491 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.516378 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hfhr\" (UniqueName: \"kubernetes.io/projected/a750ac45-1a8d-4704-9fcb-4701164f2bd7-kube-api-access-2hfhr\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp"
Apr 23 17:56:36.516491 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.516414 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a750ac45-1a8d-4704-9fcb-4701164f2bd7-audit-log\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp"
Apr 23 17:56:36.516491 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.516441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a750ac45-1a8d-4704-9fcb-4701164f2bd7-metrics-server-audit-profiles\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp"
Apr 23 17:56:36.516491 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.516473 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a750ac45-1a8d-4704-9fcb-4701164f2bd7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp"
Apr 23 17:56:36.516703 ip-10-0-132-102
kubenswrapper[2576]: I0423 17:56:36.516494 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a750ac45-1a8d-4704-9fcb-4701164f2bd7-secret-metrics-server-tls\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.516703 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.516528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a750ac45-1a8d-4704-9fcb-4701164f2bd7-client-ca-bundle\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.617263 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.617241 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a750ac45-1a8d-4704-9fcb-4701164f2bd7-secret-metrics-server-tls\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.617375 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.617280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a750ac45-1a8d-4704-9fcb-4701164f2bd7-client-ca-bundle\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.617375 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.617346 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/a750ac45-1a8d-4704-9fcb-4701164f2bd7-secret-metrics-server-client-certs\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.617375 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.617368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hfhr\" (UniqueName: \"kubernetes.io/projected/a750ac45-1a8d-4704-9fcb-4701164f2bd7-kube-api-access-2hfhr\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.617535 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.617402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a750ac45-1a8d-4704-9fcb-4701164f2bd7-audit-log\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.617535 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.617429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a750ac45-1a8d-4704-9fcb-4701164f2bd7-metrics-server-audit-profiles\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.617535 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.617461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a750ac45-1a8d-4704-9fcb-4701164f2bd7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " 
pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.618019 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.617991 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a750ac45-1a8d-4704-9fcb-4701164f2bd7-audit-log\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.618548 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.618530 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a750ac45-1a8d-4704-9fcb-4701164f2bd7-metrics-server-audit-profiles\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.618708 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.618671 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a750ac45-1a8d-4704-9fcb-4701164f2bd7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.620072 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.620041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a750ac45-1a8d-4704-9fcb-4701164f2bd7-client-ca-bundle\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.620362 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.620341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/a750ac45-1a8d-4704-9fcb-4701164f2bd7-secret-metrics-server-client-certs\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.620548 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.620499 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a750ac45-1a8d-4704-9fcb-4701164f2bd7-secret-metrics-server-tls\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.631479 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.631436 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hfhr\" (UniqueName: \"kubernetes.io/projected/a750ac45-1a8d-4704-9fcb-4701164f2bd7-kube-api-access-2hfhr\") pod \"metrics-server-594fb98f6c-rmldp\" (UID: \"a750ac45-1a8d-4704-9fcb-4701164f2bd7\") " pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.716072 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.716005 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" Apr 23 17:56:36.827193 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.827111 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-ppx75"] Apr 23 17:56:36.831395 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.831368 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ppx75" Apr 23 17:56:36.833864 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.833840 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-7tjt8\"" Apr 23 17:56:36.834168 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.833990 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 23 17:56:36.840104 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.840076 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-ppx75"] Apr 23 17:56:36.856571 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.856544 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-594fb98f6c-rmldp"] Apr 23 17:56:36.860696 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:56:36.860666 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda750ac45_1a8d_4704_9fcb_4701164f2bd7.slice/crio-5ab10609b2303b9fa0a8e7430d7c1681a38ed97499e8d427574b0b8fd78c9f33 WatchSource:0}: Error finding container 5ab10609b2303b9fa0a8e7430d7c1681a38ed97499e8d427574b0b8fd78c9f33: Status 404 returned error can't find the container with id 5ab10609b2303b9fa0a8e7430d7c1681a38ed97499e8d427574b0b8fd78c9f33 Apr 23 17:56:36.920452 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.920403 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/df7cb1b7-966f-446d-8632-851efad07ab1-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-ppx75\" (UID: \"df7cb1b7-966f-446d-8632-851efad07ab1\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ppx75" Apr 23 17:56:36.943827 ip-10-0-132-102 kubenswrapper[2576]: I0423 
17:56:36.943709 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"392b13b6-7939-4aac-8409-2fcb938f87a3","Type":"ContainerStarted","Data":"6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf"} Apr 23 17:56:36.943827 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.943777 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"392b13b6-7939-4aac-8409-2fcb938f87a3","Type":"ContainerStarted","Data":"0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c"} Apr 23 17:56:36.943827 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.943791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"392b13b6-7939-4aac-8409-2fcb938f87a3","Type":"ContainerStarted","Data":"9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5"} Apr 23 17:56:36.943827 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.943800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"392b13b6-7939-4aac-8409-2fcb938f87a3","Type":"ContainerStarted","Data":"c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d"} Apr 23 17:56:36.943827 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.943809 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"392b13b6-7939-4aac-8409-2fcb938f87a3","Type":"ContainerStarted","Data":"f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7"} Apr 23 17:56:36.944993 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:36.944965 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-8dtjc" event={"ID":"2028d82d-64c8-4897-a6c2-1adb482b3e8d","Type":"ContainerStarted","Data":"2e38ff8ae1e99beba2d5caad7bb005793d37183ca5b5bef8581a176b730c3d88"} Apr 23 17:56:36.946073 ip-10-0-132-102 
kubenswrapper[2576]: I0423 17:56:36.946048 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" event={"ID":"a750ac45-1a8d-4704-9fcb-4701164f2bd7","Type":"ContainerStarted","Data":"5ab10609b2303b9fa0a8e7430d7c1681a38ed97499e8d427574b0b8fd78c9f33"} Apr 23 17:56:37.022000 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:37.021840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/df7cb1b7-966f-446d-8632-851efad07ab1-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-ppx75\" (UID: \"df7cb1b7-966f-446d-8632-851efad07ab1\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ppx75" Apr 23 17:56:37.022000 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:56:37.021972 2576 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 23 17:56:37.022195 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:56:37.022042 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df7cb1b7-966f-446d-8632-851efad07ab1-monitoring-plugin-cert podName:df7cb1b7-966f-446d-8632-851efad07ab1 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:37.52202475 +0000 UTC m=+267.757471434 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/df7cb1b7-966f-446d-8632-851efad07ab1-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-ppx75" (UID: "df7cb1b7-966f-446d-8632-851efad07ab1") : secret "monitoring-plugin-cert" not found Apr 23 17:56:37.526001 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:37.525935 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/df7cb1b7-966f-446d-8632-851efad07ab1-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-ppx75\" (UID: \"df7cb1b7-966f-446d-8632-851efad07ab1\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ppx75" Apr 23 17:56:37.538516 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:37.538483 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/df7cb1b7-966f-446d-8632-851efad07ab1-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-ppx75\" (UID: \"df7cb1b7-966f-446d-8632-851efad07ab1\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ppx75" Apr 23 17:56:37.748006 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:37.747972 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ppx75" Apr 23 17:56:37.954089 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:37.954056 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"392b13b6-7939-4aac-8409-2fcb938f87a3","Type":"ContainerStarted","Data":"a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442"} Apr 23 17:56:37.984425 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:37.984216 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.233914833 podStartE2EDuration="4.984195539s" podCreationTimestamp="2026-04-23 17:56:33 +0000 UTC" firstStartedPulling="2026-04-23 17:56:33.608677391 +0000 UTC m=+263.844124072" lastFinishedPulling="2026-04-23 17:56:37.358958095 +0000 UTC m=+267.594404778" observedRunningTime="2026-04-23 17:56:37.982231066 +0000 UTC m=+268.217677794" watchObservedRunningTime="2026-04-23 17:56:37.984195539 +0000 UTC m=+268.219642245" Apr 23 17:56:40.015847 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:40.015793 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-ppx75"] Apr 23 17:56:40.019120 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:56:40.019092 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf7cb1b7_966f_446d_8632_851efad07ab1.slice/crio-6f15c6436e898ea0eae41144b32de889c6b4556adc529e856c5d1b12c8d98aa6 WatchSource:0}: Error finding container 6f15c6436e898ea0eae41144b32de889c6b4556adc529e856c5d1b12c8d98aa6: Status 404 returned error can't find the container with id 6f15c6436e898ea0eae41144b32de889c6b4556adc529e856c5d1b12c8d98aa6 Apr 23 17:56:40.974306 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:40.974269 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/global-pull-secret-syncer-q7mhh" event={"ID":"c10ccf97-5e76-4972-b775-25d5b2e5a887","Type":"ContainerStarted","Data":"129de33f252c49614c78284a1fb6cece91d3a0a7349db784d1526dbd9d0ed778"} Apr 23 17:56:40.975575 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:40.975537 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ppx75" event={"ID":"df7cb1b7-966f-446d-8632-851efad07ab1","Type":"ContainerStarted","Data":"6f15c6436e898ea0eae41144b32de889c6b4556adc529e856c5d1b12c8d98aa6"} Apr 23 17:56:40.977040 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:40.977014 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" event={"ID":"a750ac45-1a8d-4704-9fcb-4701164f2bd7","Type":"ContainerStarted","Data":"a8ec189b4b9b1a129fac574cfc5fe799b4f2c05fd542925ea2d80e690bdeaa1b"} Apr 23 17:56:40.991300 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:40.991241 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-q7mhh" podStartSLOduration=252.3575698 podStartE2EDuration="4m16.991225222s" podCreationTimestamp="2026-04-23 17:52:24 +0000 UTC" firstStartedPulling="2026-04-23 17:56:35.216858962 +0000 UTC m=+265.452305644" lastFinishedPulling="2026-04-23 17:56:39.850514384 +0000 UTC m=+270.085961066" observedRunningTime="2026-04-23 17:56:40.990664519 +0000 UTC m=+271.226111267" watchObservedRunningTime="2026-04-23 17:56:40.991225222 +0000 UTC m=+271.226671926" Apr 23 17:56:41.011716 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:41.011672 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp" podStartSLOduration=2.030064501 podStartE2EDuration="5.011656628s" podCreationTimestamp="2026-04-23 17:56:36 +0000 UTC" firstStartedPulling="2026-04-23 17:56:36.863550759 +0000 UTC m=+267.098997440" lastFinishedPulling="2026-04-23 
17:56:39.845142884 +0000 UTC m=+270.080589567" observedRunningTime="2026-04-23 17:56:41.011430768 +0000 UTC m=+271.246877473" watchObservedRunningTime="2026-04-23 17:56:41.011656628 +0000 UTC m=+271.247103336" Apr 23 17:56:41.982140 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:41.981912 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ppx75" event={"ID":"df7cb1b7-966f-446d-8632-851efad07ab1","Type":"ContainerStarted","Data":"76e1587b2d3fcc5681b175541b1e36a553f8cb2a1fc8439252e8fc1a35f392b0"} Apr 23 17:56:41.982533 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:41.982422 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ppx75" Apr 23 17:56:41.987715 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:41.987691 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ppx75" Apr 23 17:56:42.001002 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:42.000952 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ppx75" podStartSLOduration=4.664862913 podStartE2EDuration="6.000939446s" podCreationTimestamp="2026-04-23 17:56:36 +0000 UTC" firstStartedPulling="2026-04-23 17:56:40.025157094 +0000 UTC m=+270.260603783" lastFinishedPulling="2026-04-23 17:56:41.36123362 +0000 UTC m=+271.596680316" observedRunningTime="2026-04-23 17:56:41.999853373 +0000 UTC m=+272.235300078" watchObservedRunningTime="2026-04-23 17:56:42.000939446 +0000 UTC m=+272.236386148" Apr 23 17:56:44.905668 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:44.905633 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7d6947dcbc-j7jjm" Apr 23 17:56:45.966481 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:45.966215 2576 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-console/console-6cb98f6d8f-q9z75"] Apr 23 17:56:45.969899 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:45.969871 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cb98f6d8f-q9z75" Apr 23 17:56:45.973283 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:45.973262 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 17:56:45.973491 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:45.973468 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 17:56:45.973579 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:45.973320 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 17:56:45.973636 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:45.973604 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-lqbwv\"" Apr 23 17:56:45.974473 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:45.974453 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 17:56:45.974569 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:45.974453 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 17:56:45.980847 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:45.980825 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cb98f6d8f-q9z75"] Apr 23 17:56:46.001914 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.001885 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/c27726a8-a134-485d-874c-92d5121327e5-console-oauth-config\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75" Apr 23 17:56:46.002079 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.001926 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c27726a8-a134-485d-874c-92d5121327e5-console-serving-cert\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75" Apr 23 17:56:46.002079 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.002037 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-service-ca\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75" Apr 23 17:56:46.002169 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.002104 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-oauth-serving-cert\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75" Apr 23 17:56:46.002169 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.002141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7pkb\" (UniqueName: \"kubernetes.io/projected/c27726a8-a134-485d-874c-92d5121327e5-kube-api-access-x7pkb\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75" Apr 23 17:56:46.002256 ip-10-0-132-102 
kubenswrapper[2576]: I0423 17:56:46.002174 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-console-config\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:46.102515 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.102477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c27726a8-a134-485d-874c-92d5121327e5-console-oauth-config\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:46.102515 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.102514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c27726a8-a134-485d-874c-92d5121327e5-console-serving-cert\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:46.102726 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.102584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-service-ca\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:46.102800 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.102724 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-oauth-serving-cert\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:46.102800 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.102783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7pkb\" (UniqueName: \"kubernetes.io/projected/c27726a8-a134-485d-874c-92d5121327e5-kube-api-access-x7pkb\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:46.102904 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.102818 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-console-config\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:46.103353 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.103327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-service-ca\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:46.103676 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.103655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-oauth-serving-cert\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:46.104440 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.104420 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-console-config\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:46.105219 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.105201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c27726a8-a134-485d-874c-92d5121327e5-console-serving-cert\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:46.105294 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.105248 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c27726a8-a134-485d-874c-92d5121327e5-console-oauth-config\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:46.112061 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.112040 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7pkb\" (UniqueName: \"kubernetes.io/projected/c27726a8-a134-485d-874c-92d5121327e5-kube-api-access-x7pkb\") pod \"console-6cb98f6d8f-q9z75\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:46.281555 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:46.281464 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:52.424693 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:52.424665 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cb98f6d8f-q9z75"]
Apr 23 17:56:52.427449 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:56:52.427424 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc27726a8_a134_485d_874c_92d5121327e5.slice/crio-622f4833dfeefe306deee3c4e5329c476251f9abf198a6b872c874292e0c1d5c WatchSource:0}: Error finding container 622f4833dfeefe306deee3c4e5329c476251f9abf198a6b872c874292e0c1d5c: Status 404 returned error can't find the container with id 622f4833dfeefe306deee3c4e5329c476251f9abf198a6b872c874292e0c1d5c
Apr 23 17:56:53.016678 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:53.016629 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-8dtjc" event={"ID":"2028d82d-64c8-4897-a6c2-1adb482b3e8d","Type":"ContainerStarted","Data":"4521c2b9204b30f6692acb497abdc439d70676cd44cb1561e4257c930b7b4539"}
Apr 23 17:56:53.017304 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:53.017257 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-8dtjc"
Apr 23 17:56:53.019028 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:53.018998 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb98f6d8f-q9z75" event={"ID":"c27726a8-a134-485d-874c-92d5121327e5","Type":"ContainerStarted","Data":"622f4833dfeefe306deee3c4e5329c476251f9abf198a6b872c874292e0c1d5c"}
Apr 23 17:56:53.029682 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:53.029655 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-8dtjc"
Apr 23 17:56:53.034498 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:53.034446 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-8dtjc" podStartSLOduration=1.846544672 podStartE2EDuration="18.03443117s" podCreationTimestamp="2026-04-23 17:56:35 +0000 UTC" firstStartedPulling="2026-04-23 17:56:36.223412006 +0000 UTC m=+266.458858688" lastFinishedPulling="2026-04-23 17:56:52.411298505 +0000 UTC m=+282.646745186" observedRunningTime="2026-04-23 17:56:53.032902355 +0000 UTC m=+283.268349062" watchObservedRunningTime="2026-04-23 17:56:53.03443117 +0000 UTC m=+283.269877874"
Apr 23 17:56:54.477628 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.477586 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-54567bcbb-7ssnc"]
Apr 23 17:56:54.505174 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.505128 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54567bcbb-7ssnc"]
Apr 23 17:56:54.505425 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.505280 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.518570 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.518488 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 23 17:56:54.570372 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.570326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-serving-cert\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.570658 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.570388 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-config\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.570658 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.570467 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-trusted-ca-bundle\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.570658 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.570491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-oauth-serving-cert\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.570658 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.570526 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-service-ca\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.570658 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.570549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk7s8\" (UniqueName: \"kubernetes.io/projected/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-kube-api-access-xk7s8\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.570658 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.570608 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-oauth-config\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.671903 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.671778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-serving-cert\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.671903 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.671855 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-config\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.672169 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.671939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-trusted-ca-bundle\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.672169 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.671963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-oauth-serving-cert\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.672169 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.671997 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-service-ca\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.672169 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.672017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xk7s8\" (UniqueName: \"kubernetes.io/projected/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-kube-api-access-xk7s8\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.672169 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.672082 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-oauth-config\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.673344 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.672896 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-service-ca\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.673566 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.673510 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-trusted-ca-bundle\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.673566 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.673525 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-oauth-serving-cert\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.673794 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.673772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-config\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.675101 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.675073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-serving-cert\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.675101 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.675138 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-oauth-config\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.682079 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.682048 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk7s8\" (UniqueName: \"kubernetes.io/projected/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-kube-api-access-xk7s8\") pod \"console-54567bcbb-7ssnc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.825014 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.824929 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:56:54.903924 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:54.903866 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-x77gx"
Apr 23 17:56:55.597158 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:55.597124 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54567bcbb-7ssnc"]
Apr 23 17:56:55.815349 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:56:55.815271 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf90c7c1e_5a74_47b2_b56b_e3ec683385dc.slice/crio-64a6aa8c61476cf0c378e1aaf5dda9216d72fe1a95298dce1e5433d8a4f0871c WatchSource:0}: Error finding container 64a6aa8c61476cf0c378e1aaf5dda9216d72fe1a95298dce1e5433d8a4f0871c: Status 404 returned error can't find the container with id 64a6aa8c61476cf0c378e1aaf5dda9216d72fe1a95298dce1e5433d8a4f0871c
Apr 23 17:56:56.029813 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:56.029715 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb98f6d8f-q9z75" event={"ID":"c27726a8-a134-485d-874c-92d5121327e5","Type":"ContainerStarted","Data":"76c171dd4781bd650cde38dc32f6d7615005649a7f56d02349eccb7c44cc47b3"}
Apr 23 17:56:56.031677 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:56.031643 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54567bcbb-7ssnc" event={"ID":"f90c7c1e-5a74-47b2-b56b-e3ec683385dc","Type":"ContainerStarted","Data":"769e0589d39a28a6e42c8d5177cd9674f89eba81546d33f38c017e4aad501347"}
Apr 23 17:56:56.031825 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:56.031681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54567bcbb-7ssnc" event={"ID":"f90c7c1e-5a74-47b2-b56b-e3ec683385dc","Type":"ContainerStarted","Data":"64a6aa8c61476cf0c378e1aaf5dda9216d72fe1a95298dce1e5433d8a4f0871c"}
Apr 23 17:56:56.050328 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:56.050281 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6cb98f6d8f-q9z75" podStartSLOduration=7.635916356 podStartE2EDuration="11.050265421s" podCreationTimestamp="2026-04-23 17:56:45 +0000 UTC" firstStartedPulling="2026-04-23 17:56:52.429764432 +0000 UTC m=+282.665211115" lastFinishedPulling="2026-04-23 17:56:55.844113492 +0000 UTC m=+286.079560180" observedRunningTime="2026-04-23 17:56:56.049099823 +0000 UTC m=+286.284546529" watchObservedRunningTime="2026-04-23 17:56:56.050265421 +0000 UTC m=+286.285712124"
Apr 23 17:56:56.067668 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:56.067562 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54567bcbb-7ssnc" podStartSLOduration=2.067542039 podStartE2EDuration="2.067542039s" podCreationTimestamp="2026-04-23 17:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:56:56.065901489 +0000 UTC m=+286.301348196" watchObservedRunningTime="2026-04-23 17:56:56.067542039 +0000 UTC m=+286.302988746"
Apr 23 17:56:56.282569 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:56.282526 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:56.282569 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:56.282576 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:56.288134 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:56.288103 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:56:56.716999 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:56.716955 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp"
Apr 23 17:56:56.717463 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:56.717037 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp"
Apr 23 17:56:57.040615 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:56:57.040528 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6cb98f6d8f-q9z75"
Apr 23 17:57:04.825648 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:04.825608 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:57:04.825648 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:04.825650 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:57:04.830535 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:04.830509 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:57:05.062831 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:05.062805 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54567bcbb-7ssnc"
Apr 23 17:57:05.107549 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:05.107462 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cb98f6d8f-q9z75"]
Apr 23 17:57:10.278769 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:10.278717 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6g56n_ae56a92f-dfae-4763-b849-dca72bc2cf3d/kube-multus/0.log"
Apr 23 17:57:10.279964 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:10.279942 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6g56n_ae56a92f-dfae-4763-b849-dca72bc2cf3d/kube-multus/0.log"
Apr 23 17:57:10.283448 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:10.283428 2576 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 17:57:11.721101 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:11.721058 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs\") pod \"network-metrics-daemon-jfhpv\" (UID: \"5baefb5e-77f1-440a-918c-82da4620b8d7\") " pod="openshift-multus/network-metrics-daemon-jfhpv"
Apr 23 17:57:11.739028 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:11.723284 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5baefb5e-77f1-440a-918c-82da4620b8d7-metrics-certs\") pod \"network-metrics-daemon-jfhpv\" (UID: \"5baefb5e-77f1-440a-918c-82da4620b8d7\") " pod="openshift-multus/network-metrics-daemon-jfhpv"
Apr 23 17:57:11.803689 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:11.803653 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-s85dw\""
Apr 23 17:57:11.809993 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:11.809974 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jfhpv"
Apr 23 17:57:11.924374 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:11.924341 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jfhpv"]
Apr 23 17:57:11.928219 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:57:11.928190 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5baefb5e_77f1_440a_918c_82da4620b8d7.slice/crio-62fcd2230db5ec229ad870843295012bb189ce14ada780233a7fde59e93c2c8e WatchSource:0}: Error finding container 62fcd2230db5ec229ad870843295012bb189ce14ada780233a7fde59e93c2c8e: Status 404 returned error can't find the container with id 62fcd2230db5ec229ad870843295012bb189ce14ada780233a7fde59e93c2c8e
Apr 23 17:57:11.930119 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:11.930101 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:57:12.082587 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:12.082502 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jfhpv" event={"ID":"5baefb5e-77f1-440a-918c-82da4620b8d7","Type":"ContainerStarted","Data":"62fcd2230db5ec229ad870843295012bb189ce14ada780233a7fde59e93c2c8e"}
Apr 23 17:57:14.089012 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:14.088981 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jfhpv" event={"ID":"5baefb5e-77f1-440a-918c-82da4620b8d7","Type":"ContainerStarted","Data":"158fa06a8f3170ac21c563eea47bbbb2003a4f8593454259dcd035da13842fb4"}
Apr 23 17:57:14.089012 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:14.089014 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jfhpv" event={"ID":"5baefb5e-77f1-440a-918c-82da4620b8d7","Type":"ContainerStarted","Data":"c6a8c75f93d75c9577b14a8c2b8c44febcbee720c7bb7ce18bd3a0f669ef4d70"}
Apr 23 17:57:14.106283 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:14.105769 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jfhpv" podStartSLOduration=129.718294721 podStartE2EDuration="2m11.105737063s" podCreationTimestamp="2026-04-23 17:55:03 +0000 UTC" firstStartedPulling="2026-04-23 17:57:11.930285097 +0000 UTC m=+302.165731783" lastFinishedPulling="2026-04-23 17:57:13.317727443 +0000 UTC m=+303.553174125" observedRunningTime="2026-04-23 17:57:14.105233129 +0000 UTC m=+304.340679833" watchObservedRunningTime="2026-04-23 17:57:14.105737063 +0000 UTC m=+304.341183859"
Apr 23 17:57:16.721662 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:16.721635 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp"
Apr 23 17:57:16.725411 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:16.725389 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-594fb98f6c-rmldp"
Apr 23 17:57:29.365048 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.365007 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c4bd476c5-768ng"]
Apr 23 17:57:29.396231 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.396197 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c4bd476c5-768ng"]
Apr 23 17:57:29.396383 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.396295 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c4bd476c5-768ng"
Apr 23 17:57:29.398389 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.398363 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-xzwk9\""
Apr 23 17:57:29.398578 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.398553 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 23 17:57:29.398976 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.398940 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 23 17:57:29.399077 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.399026 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 23 17:57:29.399283 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.399268 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 23 17:57:29.473870 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.473835 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq"]
Apr 23 17:57:29.494251 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.494224 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq"]
Apr 23 17:57:29.494397 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.494336 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq"
Apr 23 17:57:29.496603 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.496576 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 23 17:57:29.496733 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.496643 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 23 17:57:29.496848 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.496824 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 23 17:57:29.496965 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.496889 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 23 17:57:29.573753 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.573707 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8aa48703-0783-4477-88ed-8cd526e10ec7-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-c4bd476c5-768ng\" (UID: \"8aa48703-0783-4477-88ed-8cd526e10ec7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c4bd476c5-768ng"
Apr 23 17:57:29.573944 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.573848 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cz29\" (UniqueName: \"kubernetes.io/projected/8aa48703-0783-4477-88ed-8cd526e10ec7-kube-api-access-8cz29\") pod \"managed-serviceaccount-addon-agent-c4bd476c5-768ng\" (UID: \"8aa48703-0783-4477-88ed-8cd526e10ec7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c4bd476c5-768ng"
Apr 23 17:57:29.674675 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.674587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8aa48703-0783-4477-88ed-8cd526e10ec7-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-c4bd476c5-768ng\" (UID: \"8aa48703-0783-4477-88ed-8cd526e10ec7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c4bd476c5-768ng"
Apr 23 17:57:29.674675 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.674639 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/580983ea-e84c-432c-881c-8e8ed1d84f30-ca\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq"
Apr 23 17:57:29.674675 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.674667 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/580983ea-e84c-432c-881c-8e8ed1d84f30-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq"
Apr 23 17:57:29.674982 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.674684 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlbk9\" (UniqueName: \"kubernetes.io/projected/580983ea-e84c-432c-881c-8e8ed1d84f30-kube-api-access-zlbk9\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq"
Apr 23 17:57:29.674982 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.674720 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/580983ea-e84c-432c-881c-8e8ed1d84f30-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq"
Apr 23 17:57:29.674982 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.674735 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/580983ea-e84c-432c-881c-8e8ed1d84f30-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq"
Apr 23 17:57:29.674982 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.674837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cz29\" (UniqueName: \"kubernetes.io/projected/8aa48703-0783-4477-88ed-8cd526e10ec7-kube-api-access-8cz29\") pod \"managed-serviceaccount-addon-agent-c4bd476c5-768ng\" (UID: \"8aa48703-0783-4477-88ed-8cd526e10ec7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c4bd476c5-768ng"
Apr 23 17:57:29.674982 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.674901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/580983ea-e84c-432c-881c-8e8ed1d84f30-hub\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq"
Apr 23 17:57:29.677151 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.677127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8aa48703-0783-4477-88ed-8cd526e10ec7-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-c4bd476c5-768ng\" (UID: \"8aa48703-0783-4477-88ed-8cd526e10ec7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c4bd476c5-768ng"
Apr 23 17:57:29.682649 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.682624 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cz29\" (UniqueName: \"kubernetes.io/projected/8aa48703-0783-4477-88ed-8cd526e10ec7-kube-api-access-8cz29\") pod \"managed-serviceaccount-addon-agent-c4bd476c5-768ng\" (UID: \"8aa48703-0783-4477-88ed-8cd526e10ec7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c4bd476c5-768ng"
Apr 23 17:57:29.713812 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.713780 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c4bd476c5-768ng"
Apr 23 17:57:29.775452 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.775418 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/580983ea-e84c-432c-881c-8e8ed1d84f30-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq"
Apr 23 17:57:29.775601 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.775464 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/580983ea-e84c-432c-881c-8e8ed1d84f30-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq"
Apr 23 17:57:29.775601 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.775566 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/580983ea-e84c-432c-881c-8e8ed1d84f30-hub\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq"
Apr 23 17:57:29.775706 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.775612 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/580983ea-e84c-432c-881c-8e8ed1d84f30-ca\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq"
Apr 23 17:57:29.775706 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.775649 2576
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/580983ea-e84c-432c-881c-8e8ed1d84f30-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq" Apr 23 17:57:29.775706 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.775675 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlbk9\" (UniqueName: \"kubernetes.io/projected/580983ea-e84c-432c-881c-8e8ed1d84f30-kube-api-access-zlbk9\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq" Apr 23 17:57:29.776263 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.776237 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/580983ea-e84c-432c-881c-8e8ed1d84f30-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq" Apr 23 17:57:29.778255 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.778218 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/580983ea-e84c-432c-881c-8e8ed1d84f30-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq" Apr 23 17:57:29.778357 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.778338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: 
\"kubernetes.io/secret/580983ea-e84c-432c-881c-8e8ed1d84f30-ca\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq" Apr 23 17:57:29.778600 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.778578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/580983ea-e84c-432c-881c-8e8ed1d84f30-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq" Apr 23 17:57:29.778700 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.778682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/580983ea-e84c-432c-881c-8e8ed1d84f30-hub\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq" Apr 23 17:57:29.784090 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.784069 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlbk9\" (UniqueName: \"kubernetes.io/projected/580983ea-e84c-432c-881c-8e8ed1d84f30-kube-api-access-zlbk9\") pod \"cluster-proxy-proxy-agent-858bdf7ffb-mqgfq\" (UID: \"580983ea-e84c-432c-881c-8e8ed1d84f30\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq" Apr 23 17:57:29.803377 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.803339 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq" Apr 23 17:57:29.831073 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.831046 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c4bd476c5-768ng"] Apr 23 17:57:29.834099 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:57:29.834073 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8aa48703_0783_4477_88ed_8cd526e10ec7.slice/crio-da0137f4daf6a72d139c4db7158a8673db8a765b677cb8965e4e1c376499c6bd WatchSource:0}: Error finding container da0137f4daf6a72d139c4db7158a8673db8a765b677cb8965e4e1c376499c6bd: Status 404 returned error can't find the container with id da0137f4daf6a72d139c4db7158a8673db8a765b677cb8965e4e1c376499c6bd Apr 23 17:57:29.927475 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:29.927398 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq"] Apr 23 17:57:29.930285 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:57:29.930257 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod580983ea_e84c_432c_881c_8e8ed1d84f30.slice/crio-103be37f14f59a47397f234821d830a79314d364ac62f0ecfc13ef34218d3ce5 WatchSource:0}: Error finding container 103be37f14f59a47397f234821d830a79314d364ac62f0ecfc13ef34218d3ce5: Status 404 returned error can't find the container with id 103be37f14f59a47397f234821d830a79314d364ac62f0ecfc13ef34218d3ce5 Apr 23 17:57:30.128115 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.128071 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6cb98f6d8f-q9z75" podUID="c27726a8-a134-485d-874c-92d5121327e5" containerName="console" 
containerID="cri-o://76c171dd4781bd650cde38dc32f6d7615005649a7f56d02349eccb7c44cc47b3" gracePeriod=15 Apr 23 17:57:30.135949 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.135920 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq" event={"ID":"580983ea-e84c-432c-881c-8e8ed1d84f30","Type":"ContainerStarted","Data":"103be37f14f59a47397f234821d830a79314d364ac62f0ecfc13ef34218d3ce5"} Apr 23 17:57:30.136860 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.136839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c4bd476c5-768ng" event={"ID":"8aa48703-0783-4477-88ed-8cd526e10ec7","Type":"ContainerStarted","Data":"da0137f4daf6a72d139c4db7158a8673db8a765b677cb8965e4e1c376499c6bd"} Apr 23 17:57:30.391911 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.391884 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cb98f6d8f-q9z75_c27726a8-a134-485d-874c-92d5121327e5/console/0.log" Apr 23 17:57:30.392542 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.392523 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6cb98f6d8f-q9z75" Apr 23 17:57:30.582239 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.582152 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c27726a8-a134-485d-874c-92d5121327e5-console-oauth-config\") pod \"c27726a8-a134-485d-874c-92d5121327e5\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " Apr 23 17:57:30.582398 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.582256 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7pkb\" (UniqueName: \"kubernetes.io/projected/c27726a8-a134-485d-874c-92d5121327e5-kube-api-access-x7pkb\") pod \"c27726a8-a134-485d-874c-92d5121327e5\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " Apr 23 17:57:30.582398 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.582310 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-oauth-serving-cert\") pod \"c27726a8-a134-485d-874c-92d5121327e5\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " Apr 23 17:57:30.582398 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.582336 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-console-config\") pod \"c27726a8-a134-485d-874c-92d5121327e5\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " Apr 23 17:57:30.582398 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.582376 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c27726a8-a134-485d-874c-92d5121327e5-console-serving-cert\") pod \"c27726a8-a134-485d-874c-92d5121327e5\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " Apr 23 
17:57:30.582614 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.582413 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-service-ca\") pod \"c27726a8-a134-485d-874c-92d5121327e5\" (UID: \"c27726a8-a134-485d-874c-92d5121327e5\") " Apr 23 17:57:30.582825 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.582786 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-console-config" (OuterVolumeSpecName: "console-config") pod "c27726a8-a134-485d-874c-92d5121327e5" (UID: "c27726a8-a134-485d-874c-92d5121327e5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:57:30.582978 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.582800 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c27726a8-a134-485d-874c-92d5121327e5" (UID: "c27726a8-a134-485d-874c-92d5121327e5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:57:30.583356 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.583320 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-service-ca" (OuterVolumeSpecName: "service-ca") pod "c27726a8-a134-485d-874c-92d5121327e5" (UID: "c27726a8-a134-485d-874c-92d5121327e5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:57:30.585042 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.585015 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c27726a8-a134-485d-874c-92d5121327e5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c27726a8-a134-485d-874c-92d5121327e5" (UID: "c27726a8-a134-485d-874c-92d5121327e5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:57:30.585143 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.585082 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c27726a8-a134-485d-874c-92d5121327e5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c27726a8-a134-485d-874c-92d5121327e5" (UID: "c27726a8-a134-485d-874c-92d5121327e5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:57:30.585143 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.585099 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c27726a8-a134-485d-874c-92d5121327e5-kube-api-access-x7pkb" (OuterVolumeSpecName: "kube-api-access-x7pkb") pod "c27726a8-a134-485d-874c-92d5121327e5" (UID: "c27726a8-a134-485d-874c-92d5121327e5"). InnerVolumeSpecName "kube-api-access-x7pkb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:57:30.683518 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.683476 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x7pkb\" (UniqueName: \"kubernetes.io/projected/c27726a8-a134-485d-874c-92d5121327e5-kube-api-access-x7pkb\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:57:30.683518 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.683516 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-oauth-serving-cert\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:57:30.683754 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.683531 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-console-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:57:30.683754 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.683546 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c27726a8-a134-485d-874c-92d5121327e5-console-serving-cert\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:57:30.683754 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.683560 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c27726a8-a134-485d-874c-92d5121327e5-service-ca\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:57:30.683754 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:30.683574 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c27726a8-a134-485d-874c-92d5121327e5-console-oauth-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:57:31.142773 ip-10-0-132-102 
kubenswrapper[2576]: I0423 17:57:31.142394 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cb98f6d8f-q9z75_c27726a8-a134-485d-874c-92d5121327e5/console/0.log" Apr 23 17:57:31.142773 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:31.142455 2576 generic.go:358] "Generic (PLEG): container finished" podID="c27726a8-a134-485d-874c-92d5121327e5" containerID="76c171dd4781bd650cde38dc32f6d7615005649a7f56d02349eccb7c44cc47b3" exitCode=2 Apr 23 17:57:31.142773 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:31.142532 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb98f6d8f-q9z75" event={"ID":"c27726a8-a134-485d-874c-92d5121327e5","Type":"ContainerDied","Data":"76c171dd4781bd650cde38dc32f6d7615005649a7f56d02349eccb7c44cc47b3"} Apr 23 17:57:31.142773 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:31.142564 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb98f6d8f-q9z75" event={"ID":"c27726a8-a134-485d-874c-92d5121327e5","Type":"ContainerDied","Data":"622f4833dfeefe306deee3c4e5329c476251f9abf198a6b872c874292e0c1d5c"} Apr 23 17:57:31.142773 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:31.142584 2576 scope.go:117] "RemoveContainer" containerID="76c171dd4781bd650cde38dc32f6d7615005649a7f56d02349eccb7c44cc47b3" Apr 23 17:57:31.144895 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:31.143244 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6cb98f6d8f-q9z75" Apr 23 17:57:31.161695 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:31.161666 2576 scope.go:117] "RemoveContainer" containerID="76c171dd4781bd650cde38dc32f6d7615005649a7f56d02349eccb7c44cc47b3" Apr 23 17:57:31.162354 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:57:31.162269 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76c171dd4781bd650cde38dc32f6d7615005649a7f56d02349eccb7c44cc47b3\": container with ID starting with 76c171dd4781bd650cde38dc32f6d7615005649a7f56d02349eccb7c44cc47b3 not found: ID does not exist" containerID="76c171dd4781bd650cde38dc32f6d7615005649a7f56d02349eccb7c44cc47b3" Apr 23 17:57:31.162354 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:31.162305 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76c171dd4781bd650cde38dc32f6d7615005649a7f56d02349eccb7c44cc47b3"} err="failed to get container status \"76c171dd4781bd650cde38dc32f6d7615005649a7f56d02349eccb7c44cc47b3\": rpc error: code = NotFound desc = could not find container \"76c171dd4781bd650cde38dc32f6d7615005649a7f56d02349eccb7c44cc47b3\": container with ID starting with 76c171dd4781bd650cde38dc32f6d7615005649a7f56d02349eccb7c44cc47b3 not found: ID does not exist" Apr 23 17:57:31.172782 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:31.172725 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cb98f6d8f-q9z75"] Apr 23 17:57:31.175056 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:31.175031 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6cb98f6d8f-q9z75"] Apr 23 17:57:32.396061 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:32.396027 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c27726a8-a134-485d-874c-92d5121327e5" 
path="/var/lib/kubelet/pods/c27726a8-a134-485d-874c-92d5121327e5/volumes" Apr 23 17:57:35.157670 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:35.157636 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq" event={"ID":"580983ea-e84c-432c-881c-8e8ed1d84f30","Type":"ContainerStarted","Data":"c668bb0a0464c4730b4929bd6ae809ac7b38a27f7a0d745c0cd2e93ce599589e"} Apr 23 17:57:35.159167 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:35.159138 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c4bd476c5-768ng" event={"ID":"8aa48703-0783-4477-88ed-8cd526e10ec7","Type":"ContainerStarted","Data":"98a0e2c46ef97546d3273bcaf53233c6a629103e6b656b7c338a36f34631dd6f"} Apr 23 17:57:35.173511 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:35.173436 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c4bd476c5-768ng" podStartSLOduration=1.709667934 podStartE2EDuration="6.17341978s" podCreationTimestamp="2026-04-23 17:57:29 +0000 UTC" firstStartedPulling="2026-04-23 17:57:29.836413204 +0000 UTC m=+320.071859885" lastFinishedPulling="2026-04-23 17:57:34.300165036 +0000 UTC m=+324.535611731" observedRunningTime="2026-04-23 17:57:35.172656845 +0000 UTC m=+325.408103549" watchObservedRunningTime="2026-04-23 17:57:35.17341978 +0000 UTC m=+325.408866484" Apr 23 17:57:37.165994 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:37.165961 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq" event={"ID":"580983ea-e84c-432c-881c-8e8ed1d84f30","Type":"ContainerStarted","Data":"eab4232e937a9d917931acd2bb54db9a7478759d5966069a6c0c2e84a30bc3d4"} Apr 23 17:57:37.165994 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:37.165998 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq" event={"ID":"580983ea-e84c-432c-881c-8e8ed1d84f30","Type":"ContainerStarted","Data":"d9c1eae89a78eb68e00fd73b4bee364db3586923f0eb441822f785a068416bc8"} Apr 23 17:57:37.184665 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:37.184621 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858bdf7ffb-mqgfq" podStartSLOduration=1.239537696 podStartE2EDuration="8.184603345s" podCreationTimestamp="2026-04-23 17:57:29 +0000 UTC" firstStartedPulling="2026-04-23 17:57:29.932064517 +0000 UTC m=+320.167511202" lastFinishedPulling="2026-04-23 17:57:36.877130157 +0000 UTC m=+327.112576851" observedRunningTime="2026-04-23 17:57:37.182772958 +0000 UTC m=+327.418219657" watchObservedRunningTime="2026-04-23 17:57:37.184603345 +0000 UTC m=+327.420050052" Apr 23 17:57:52.444094 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.444062 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 17:57:52.444508 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.444465 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="alertmanager" containerID="cri-o://f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7" gracePeriod=120 Apr 23 17:57:52.444592 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.444545 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="kube-rbac-proxy-metric" containerID="cri-o://6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf" gracePeriod=120 Apr 23 17:57:52.444731 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.444631 2576 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="kube-rbac-proxy-web" containerID="cri-o://9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5" gracePeriod=120 Apr 23 17:57:52.444731 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.444583 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="config-reloader" containerID="cri-o://c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d" gracePeriod=120 Apr 23 17:57:52.444731 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.444583 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="prom-label-proxy" containerID="cri-o://a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442" gracePeriod=120 Apr 23 17:57:52.444941 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.444620 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="kube-rbac-proxy" containerID="cri-o://0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c" gracePeriod=120 Apr 23 17:57:52.662764 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.662714 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56869659c-xlgm4"] Apr 23 17:57:52.663023 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.663010 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c27726a8-a134-485d-874c-92d5121327e5" containerName="console" Apr 23 17:57:52.663085 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.663024 2576 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c27726a8-a134-485d-874c-92d5121327e5" containerName="console" Apr 23 17:57:52.663085 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.663078 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c27726a8-a134-485d-874c-92d5121327e5" containerName="console" Apr 23 17:57:52.665887 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.665860 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56869659c-xlgm4" Apr 23 17:57:52.673608 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.673585 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56869659c-xlgm4"] Apr 23 17:57:52.749819 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.749783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a91a7fa7-a54b-4022-b381-1f1f05e156b0-service-ca\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4" Apr 23 17:57:52.749819 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.749822 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a91a7fa7-a54b-4022-b381-1f1f05e156b0-oauth-serving-cert\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4" Apr 23 17:57:52.750011 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.749930 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a91a7fa7-a54b-4022-b381-1f1f05e156b0-console-oauth-config\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4" Apr 23 17:57:52.750011 
ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.749961 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a91a7fa7-a54b-4022-b381-1f1f05e156b0-console-serving-cert\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.750011 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.749980 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a91a7fa7-a54b-4022-b381-1f1f05e156b0-trusted-ca-bundle\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.750011 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.750001 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a91a7fa7-a54b-4022-b381-1f1f05e156b0-console-config\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.750130 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.750028 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58f92\" (UniqueName: \"kubernetes.io/projected/a91a7fa7-a54b-4022-b381-1f1f05e156b0-kube-api-access-58f92\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.850668 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.850572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a91a7fa7-a54b-4022-b381-1f1f05e156b0-console-oauth-config\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.850668 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.850616 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a91a7fa7-a54b-4022-b381-1f1f05e156b0-console-serving-cert\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.850668 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.850637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a91a7fa7-a54b-4022-b381-1f1f05e156b0-trusted-ca-bundle\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.850668 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.850664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a91a7fa7-a54b-4022-b381-1f1f05e156b0-console-config\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.851017 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.850778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58f92\" (UniqueName: \"kubernetes.io/projected/a91a7fa7-a54b-4022-b381-1f1f05e156b0-kube-api-access-58f92\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.851017 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.850850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a91a7fa7-a54b-4022-b381-1f1f05e156b0-service-ca\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.851017 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.850883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a91a7fa7-a54b-4022-b381-1f1f05e156b0-oauth-serving-cert\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.851386 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.851365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a91a7fa7-a54b-4022-b381-1f1f05e156b0-console-config\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.851478 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.851454 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a91a7fa7-a54b-4022-b381-1f1f05e156b0-service-ca\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.851580 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.851564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a91a7fa7-a54b-4022-b381-1f1f05e156b0-oauth-serving-cert\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.851656 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.851639 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a91a7fa7-a54b-4022-b381-1f1f05e156b0-trusted-ca-bundle\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.853627 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.853607 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a91a7fa7-a54b-4022-b381-1f1f05e156b0-console-serving-cert\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.853712 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.853697 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a91a7fa7-a54b-4022-b381-1f1f05e156b0-console-oauth-config\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.858045 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.858026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58f92\" (UniqueName: \"kubernetes.io/projected/a91a7fa7-a54b-4022-b381-1f1f05e156b0-kube-api-access-58f92\") pod \"console-56869659c-xlgm4\" (UID: \"a91a7fa7-a54b-4022-b381-1f1f05e156b0\") " pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:52.975455 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:52.975421 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56869659c-xlgm4"
Apr 23 17:57:53.105795 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.105763 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56869659c-xlgm4"]
Apr 23 17:57:53.108316 ip-10-0-132-102 kubenswrapper[2576]: W0423 17:57:53.108284 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda91a7fa7_a54b_4022_b381_1f1f05e156b0.slice/crio-1ab92b6728f256c397cfe5ccec2452d00af5d27e93f0a22b839ab5016095402b WatchSource:0}: Error finding container 1ab92b6728f256c397cfe5ccec2452d00af5d27e93f0a22b839ab5016095402b: Status 404 returned error can't find the container with id 1ab92b6728f256c397cfe5ccec2452d00af5d27e93f0a22b839ab5016095402b
Apr 23 17:57:53.211373 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.211330 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56869659c-xlgm4" event={"ID":"a91a7fa7-a54b-4022-b381-1f1f05e156b0","Type":"ContainerStarted","Data":"bf624bc1d43db082d279000de09eaf9bdcab3c468f2201a1ac04346dbf0fe176"}
Apr 23 17:57:53.211373 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.211378 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56869659c-xlgm4" event={"ID":"a91a7fa7-a54b-4022-b381-1f1f05e156b0","Type":"ContainerStarted","Data":"1ab92b6728f256c397cfe5ccec2452d00af5d27e93f0a22b839ab5016095402b"}
Apr 23 17:57:53.214507 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.214482 2576 generic.go:358] "Generic (PLEG): container finished" podID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerID="a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442" exitCode=0
Apr 23 17:57:53.214507 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.214505 2576 generic.go:358] "Generic (PLEG): container finished" podID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerID="0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c" exitCode=0
Apr 23 17:57:53.214507 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.214512 2576 generic.go:358] "Generic (PLEG): container finished" podID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerID="c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d" exitCode=0
Apr 23 17:57:53.214674 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.214517 2576 generic.go:358] "Generic (PLEG): container finished" podID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerID="f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7" exitCode=0
Apr 23 17:57:53.214674 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.214544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"392b13b6-7939-4aac-8409-2fcb938f87a3","Type":"ContainerDied","Data":"a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442"}
Apr 23 17:57:53.214674 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.214562 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"392b13b6-7939-4aac-8409-2fcb938f87a3","Type":"ContainerDied","Data":"0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c"}
Apr 23 17:57:53.214674 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.214574 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"392b13b6-7939-4aac-8409-2fcb938f87a3","Type":"ContainerDied","Data":"c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d"}
Apr 23 17:57:53.214674 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.214614 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"392b13b6-7939-4aac-8409-2fcb938f87a3","Type":"ContainerDied","Data":"f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7"}
Apr 23 17:57:53.227945 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.227901 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56869659c-xlgm4" podStartSLOduration=1.2278870259999999 podStartE2EDuration="1.227887026s" podCreationTimestamp="2026-04-23 17:57:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:57:53.226926895 +0000 UTC m=+343.462373602" watchObservedRunningTime="2026-04-23 17:57:53.227887026 +0000 UTC m=+343.463333729"
Apr 23 17:57:53.676486 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.676465 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:53.759505 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.759416 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/392b13b6-7939-4aac-8409-2fcb938f87a3-metrics-client-ca\") pod \"392b13b6-7939-4aac-8409-2fcb938f87a3\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") "
Apr 23 17:57:53.759505 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.759466 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-main-tls\") pod \"392b13b6-7939-4aac-8409-2fcb938f87a3\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") "
Apr 23 17:57:53.759505 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.759488 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy-web\") pod \"392b13b6-7939-4aac-8409-2fcb938f87a3\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") "
Apr 23 17:57:53.759789 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.759516 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/392b13b6-7939-4aac-8409-2fcb938f87a3-tls-assets\") pod \"392b13b6-7939-4aac-8409-2fcb938f87a3\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") "
Apr 23 17:57:53.759789 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.759532 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy\") pod \"392b13b6-7939-4aac-8409-2fcb938f87a3\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") "
Apr 23 17:57:53.759789 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.759548 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jtb4\" (UniqueName: \"kubernetes.io/projected/392b13b6-7939-4aac-8409-2fcb938f87a3-kube-api-access-4jtb4\") pod \"392b13b6-7939-4aac-8409-2fcb938f87a3\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") "
Apr 23 17:57:53.759789 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.759568 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-web-config\") pod \"392b13b6-7939-4aac-8409-2fcb938f87a3\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") "
Apr 23 17:57:53.759789 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.759588 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-cluster-tls-config\") pod \"392b13b6-7939-4aac-8409-2fcb938f87a3\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") "
Apr 23 17:57:53.759789 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.759629 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/392b13b6-7939-4aac-8409-2fcb938f87a3-alertmanager-main-db\") pod \"392b13b6-7939-4aac-8409-2fcb938f87a3\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") "
Apr 23 17:57:53.759789 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.759656 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/392b13b6-7939-4aac-8409-2fcb938f87a3-alertmanager-trusted-ca-bundle\") pod \"392b13b6-7939-4aac-8409-2fcb938f87a3\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") "
Apr 23 17:57:53.759789 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.759684 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/392b13b6-7939-4aac-8409-2fcb938f87a3-config-out\") pod \"392b13b6-7939-4aac-8409-2fcb938f87a3\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") "
Apr 23 17:57:53.759789 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.759703 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-config-volume\") pod \"392b13b6-7939-4aac-8409-2fcb938f87a3\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") "
Apr 23 17:57:53.759789 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.759755 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"392b13b6-7939-4aac-8409-2fcb938f87a3\" (UID: \"392b13b6-7939-4aac-8409-2fcb938f87a3\") "
Apr 23 17:57:53.760261 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.759946 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/392b13b6-7939-4aac-8409-2fcb938f87a3-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "392b13b6-7939-4aac-8409-2fcb938f87a3" (UID: "392b13b6-7939-4aac-8409-2fcb938f87a3"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:57:53.760261 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.760252 2576 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/392b13b6-7939-4aac-8409-2fcb938f87a3-metrics-client-ca\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:57:53.760666 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.760638 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/392b13b6-7939-4aac-8409-2fcb938f87a3-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "392b13b6-7939-4aac-8409-2fcb938f87a3" (UID: "392b13b6-7939-4aac-8409-2fcb938f87a3"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:57:53.761551 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.761521 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/392b13b6-7939-4aac-8409-2fcb938f87a3-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "392b13b6-7939-4aac-8409-2fcb938f87a3" (UID: "392b13b6-7939-4aac-8409-2fcb938f87a3"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:57:53.762677 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.762633 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "392b13b6-7939-4aac-8409-2fcb938f87a3" (UID: "392b13b6-7939-4aac-8409-2fcb938f87a3"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:57:53.763100 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.763074 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392b13b6-7939-4aac-8409-2fcb938f87a3-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "392b13b6-7939-4aac-8409-2fcb938f87a3" (UID: "392b13b6-7939-4aac-8409-2fcb938f87a3"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 17:57:53.763261 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.763227 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "392b13b6-7939-4aac-8409-2fcb938f87a3" (UID: "392b13b6-7939-4aac-8409-2fcb938f87a3"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:57:53.763373 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.763290 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392b13b6-7939-4aac-8409-2fcb938f87a3-kube-api-access-4jtb4" (OuterVolumeSpecName: "kube-api-access-4jtb4") pod "392b13b6-7939-4aac-8409-2fcb938f87a3" (UID: "392b13b6-7939-4aac-8409-2fcb938f87a3"). InnerVolumeSpecName "kube-api-access-4jtb4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 17:57:53.763373 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.763350 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "392b13b6-7939-4aac-8409-2fcb938f87a3" (UID: "392b13b6-7939-4aac-8409-2fcb938f87a3"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:57:53.763528 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.763508 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "392b13b6-7939-4aac-8409-2fcb938f87a3" (UID: "392b13b6-7939-4aac-8409-2fcb938f87a3"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:57:53.763717 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.763697 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/392b13b6-7939-4aac-8409-2fcb938f87a3-config-out" (OuterVolumeSpecName: "config-out") pod "392b13b6-7939-4aac-8409-2fcb938f87a3" (UID: "392b13b6-7939-4aac-8409-2fcb938f87a3"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:57:53.764081 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.764060 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-config-volume" (OuterVolumeSpecName: "config-volume") pod "392b13b6-7939-4aac-8409-2fcb938f87a3" (UID: "392b13b6-7939-4aac-8409-2fcb938f87a3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:57:53.766702 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.766685 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "392b13b6-7939-4aac-8409-2fcb938f87a3" (UID: "392b13b6-7939-4aac-8409-2fcb938f87a3"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:57:53.772264 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.772245 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-web-config" (OuterVolumeSpecName: "web-config") pod "392b13b6-7939-4aac-8409-2fcb938f87a3" (UID: "392b13b6-7939-4aac-8409-2fcb938f87a3"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:57:53.860730 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.860686 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/392b13b6-7939-4aac-8409-2fcb938f87a3-alertmanager-main-db\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:57:53.860730 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.860729 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/392b13b6-7939-4aac-8409-2fcb938f87a3-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:57:53.860730 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.860764 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/392b13b6-7939-4aac-8409-2fcb938f87a3-config-out\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:57:53.860981 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.860775 2576 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-config-volume\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:57:53.860981 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.860784 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:57:53.860981 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.860794 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-main-tls\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:57:53.860981 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.860803 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:57:53.860981 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.860811 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/392b13b6-7939-4aac-8409-2fcb938f87a3-tls-assets\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:57:53.860981 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.860819 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:57:53.860981 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.860828 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4jtb4\" (UniqueName: \"kubernetes.io/projected/392b13b6-7939-4aac-8409-2fcb938f87a3-kube-api-access-4jtb4\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:57:53.860981 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.860836 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-web-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:57:53.860981 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:53.860844 2576 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/392b13b6-7939-4aac-8409-2fcb938f87a3-cluster-tls-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 17:57:54.220569 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.220533 2576 generic.go:358] "Generic (PLEG): container finished" podID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerID="6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf" exitCode=0
Apr 23 17:57:54.220569 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.220563 2576 generic.go:358] "Generic (PLEG): container finished" podID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerID="9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5" exitCode=0
Apr 23 17:57:54.220779 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.220613 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"392b13b6-7939-4aac-8409-2fcb938f87a3","Type":"ContainerDied","Data":"6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf"}
Apr 23 17:57:54.220779 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.220653 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"392b13b6-7939-4aac-8409-2fcb938f87a3","Type":"ContainerDied","Data":"9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5"}
Apr 23 17:57:54.220779 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.220664 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"392b13b6-7939-4aac-8409-2fcb938f87a3","Type":"ContainerDied","Data":"3c875c0491788610b9dd619df49115ea7f5763b9b6ddb37ccab21f50dae35bf3"}
Apr 23 17:57:54.220779 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.220680 2576 scope.go:117] "RemoveContainer" containerID="a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442"
Apr 23 17:57:54.220779 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.220683 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.228061 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.228041 2576 scope.go:117] "RemoveContainer" containerID="6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf"
Apr 23 17:57:54.234834 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.234810 2576 scope.go:117] "RemoveContainer" containerID="0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c"
Apr 23 17:57:54.241113 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.241097 2576 scope.go:117] "RemoveContainer" containerID="9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5"
Apr 23 17:57:54.243366 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.243343 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 17:57:54.247339 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.247318 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 17:57:54.248055 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.248041 2576 scope.go:117] "RemoveContainer" containerID="c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d"
Apr 23 17:57:54.254472 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.254456 2576 scope.go:117] "RemoveContainer" containerID="f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7"
Apr 23 17:57:54.260351 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.260334 2576 scope.go:117] "RemoveContainer" containerID="cdd640ffab431cbc5debcf3f207ff1935cc302005f2435553b50548b5f8834d8"
Apr 23 17:57:54.266600 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.266584 2576 scope.go:117] "RemoveContainer" containerID="a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442"
Apr 23 17:57:54.266851 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:57:54.266833 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442\": container with ID starting with a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442 not found: ID does not exist" containerID="a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442"
Apr 23 17:57:54.266912 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.266858 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442"} err="failed to get container status \"a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442\": rpc error: code = NotFound desc = could not find container \"a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442\": container with ID starting with a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442 not found: ID does not exist"
Apr 23 17:57:54.266912 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.266876 2576 scope.go:117] "RemoveContainer" containerID="6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf"
Apr 23 17:57:54.267118 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:57:54.267101 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf\": container with ID starting with 6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf not found: ID does not exist" containerID="6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf"
Apr 23 17:57:54.267161 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.267123 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf"} err="failed to get container status \"6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf\": rpc error: code = NotFound desc = could not find container \"6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf\": container with ID starting with 6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf not found: ID does not exist"
Apr 23 17:57:54.267161 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.267140 2576 scope.go:117] "RemoveContainer" containerID="0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c"
Apr 23 17:57:54.267347 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:57:54.267331 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c\": container with ID starting with 0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c not found: ID does not exist" containerID="0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c"
Apr 23 17:57:54.267407 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.267355 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c"} err="failed to get container status \"0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c\": rpc error: code = NotFound desc = could not find container \"0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c\": container with ID starting with 0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c not found: ID does not exist"
Apr 23 17:57:54.267407 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.267376 2576 scope.go:117] "RemoveContainer" containerID="9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5"
Apr 23 17:57:54.267564 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:57:54.267549 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5\": container with ID starting with 9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5 not found: ID does not exist" containerID="9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5"
Apr 23 17:57:54.267606 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.267567 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5"} err="failed to get container status \"9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5\": rpc error: code = NotFound desc = could not find container \"9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5\": container with ID starting with 9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5 not found: ID does not exist"
Apr 23 17:57:54.267606 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.267580 2576 scope.go:117] "RemoveContainer" containerID="c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d"
Apr 23 17:57:54.267794 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:57:54.267777 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d\": container with ID starting with c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d not found: ID does not exist" containerID="c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d"
Apr 23 17:57:54.267842 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.267799 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d"} err="failed to get container status \"c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d\": rpc error: code = NotFound desc = could not find container \"c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d\": container with ID starting with c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d not found: ID does not exist"
Apr 23 17:57:54.267842 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.267814 2576 scope.go:117] "RemoveContainer" containerID="f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7"
Apr 23 17:57:54.268061 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:57:54.268045 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7\": container with ID starting with f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7 not found: ID does not exist" containerID="f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7"
Apr 23 17:57:54.268106 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.268074 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7"} err="failed to get container status \"f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7\": rpc error: code = NotFound desc = could not find container \"f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7\": container with ID starting with f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7 not found: ID does not exist"
Apr 23 17:57:54.268106 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.268092 2576 scope.go:117] "RemoveContainer" containerID="cdd640ffab431cbc5debcf3f207ff1935cc302005f2435553b50548b5f8834d8"
Apr 23 17:57:54.268287 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:57:54.268272 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdd640ffab431cbc5debcf3f207ff1935cc302005f2435553b50548b5f8834d8\": container with ID starting with cdd640ffab431cbc5debcf3f207ff1935cc302005f2435553b50548b5f8834d8 not found: ID does not exist" containerID="cdd640ffab431cbc5debcf3f207ff1935cc302005f2435553b50548b5f8834d8"
Apr 23 17:57:54.268326 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.268289 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd640ffab431cbc5debcf3f207ff1935cc302005f2435553b50548b5f8834d8"} err="failed to get container status \"cdd640ffab431cbc5debcf3f207ff1935cc302005f2435553b50548b5f8834d8\": rpc error: code = NotFound desc = could not find container \"cdd640ffab431cbc5debcf3f207ff1935cc302005f2435553b50548b5f8834d8\": container with ID starting with cdd640ffab431cbc5debcf3f207ff1935cc302005f2435553b50548b5f8834d8 not found: ID does not exist"
Apr 23 17:57:54.268326 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.268310 2576 scope.go:117] "RemoveContainer" containerID="a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442"
Apr 23 17:57:54.268502 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.268486 2576 pod_container_deletor.go:53] "DeleteContainer returned
error" containerID={"Type":"cri-o","ID":"a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442"} err="failed to get container status \"a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442\": rpc error: code = NotFound desc = could not find container \"a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442\": container with ID starting with a66e9ba745ae9c2cd9a5d1ce3aa3f2f3df24cd80b5ef4c0c2f12281376c23442 not found: ID does not exist" Apr 23 17:57:54.268543 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.268502 2576 scope.go:117] "RemoveContainer" containerID="6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf" Apr 23 17:57:54.268675 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.268661 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf"} err="failed to get container status \"6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf\": rpc error: code = NotFound desc = could not find container \"6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf\": container with ID starting with 6acf9223ab98a2c98122147cacaeaa5d63e6164a13e622730b6193e2fd2dbacf not found: ID does not exist" Apr 23 17:57:54.268715 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.268675 2576 scope.go:117] "RemoveContainer" containerID="0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c" Apr 23 17:57:54.268886 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.268870 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c"} err="failed to get container status \"0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c\": rpc error: code = NotFound desc = could not find container \"0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c\": container with ID starting with 
0d2542d7a74e31f40cf6bf9081d74c89f7f15f74c94927581e18b200b796115c not found: ID does not exist" Apr 23 17:57:54.268940 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.268887 2576 scope.go:117] "RemoveContainer" containerID="9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5" Apr 23 17:57:54.269102 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.269087 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5"} err="failed to get container status \"9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5\": rpc error: code = NotFound desc = could not find container \"9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5\": container with ID starting with 9336c2f9318799d731afbc011d5b274e86fe44351ac7cceff2bf05c2a65207e5 not found: ID does not exist" Apr 23 17:57:54.269102 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.269101 2576 scope.go:117] "RemoveContainer" containerID="c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d" Apr 23 17:57:54.269295 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.269282 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d"} err="failed to get container status \"c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d\": rpc error: code = NotFound desc = could not find container \"c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d\": container with ID starting with c9e058facdd83cef7801bf086b02801ad04a8970a7d1e0987a28b038dd960c2d not found: ID does not exist" Apr 23 17:57:54.269336 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.269295 2576 scope.go:117] "RemoveContainer" containerID="f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7" Apr 23 17:57:54.269476 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.269459 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7"} err="failed to get container status \"f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7\": rpc error: code = NotFound desc = could not find container \"f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7\": container with ID starting with f957cf6ef10198cbc4e8b06b401697995d66decdd8e0a12129bf5e75a820a6f7 not found: ID does not exist" Apr 23 17:57:54.269476 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.269473 2576 scope.go:117] "RemoveContainer" containerID="cdd640ffab431cbc5debcf3f207ff1935cc302005f2435553b50548b5f8834d8" Apr 23 17:57:54.269653 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.269633 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd640ffab431cbc5debcf3f207ff1935cc302005f2435553b50548b5f8834d8"} err="failed to get container status \"cdd640ffab431cbc5debcf3f207ff1935cc302005f2435553b50548b5f8834d8\": rpc error: code = NotFound desc = could not find container \"cdd640ffab431cbc5debcf3f207ff1935cc302005f2435553b50548b5f8834d8\": container with ID starting with cdd640ffab431cbc5debcf3f207ff1935cc302005f2435553b50548b5f8834d8 not found: ID does not exist" Apr 23 17:57:54.393856 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:57:54.393781 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" path="/var/lib/kubelet/pods/392b13b6-7939-4aac-8409-2fcb938f87a3/volumes" Apr 23 17:58:02.975779 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:02.975721 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56869659c-xlgm4" Apr 23 17:58:02.976183 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:02.975781 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-56869659c-xlgm4" Apr 23 17:58:02.980314 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:02.980291 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56869659c-xlgm4" Apr 23 17:58:03.252568 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:03.252489 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56869659c-xlgm4" Apr 23 17:58:03.302643 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:03.302607 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54567bcbb-7ssnc"] Apr 23 17:58:28.321430 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.321367 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-54567bcbb-7ssnc" podUID="f90c7c1e-5a74-47b2-b56b-e3ec683385dc" containerName="console" containerID="cri-o://769e0589d39a28a6e42c8d5177cd9674f89eba81546d33f38c017e4aad501347" gracePeriod=15 Apr 23 17:58:28.563380 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.563359 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54567bcbb-7ssnc_f90c7c1e-5a74-47b2-b56b-e3ec683385dc/console/0.log" Apr 23 17:58:28.563501 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.563422 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54567bcbb-7ssnc" Apr 23 17:58:28.631260 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.631223 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-oauth-serving-cert\") pod \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " Apr 23 17:58:28.631449 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.631276 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-trusted-ca-bundle\") pod \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " Apr 23 17:58:28.631520 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.631439 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-oauth-config\") pod \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " Apr 23 17:58:28.631520 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.631480 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-service-ca\") pod \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " Apr 23 17:58:28.631643 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.631520 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk7s8\" (UniqueName: \"kubernetes.io/projected/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-kube-api-access-xk7s8\") pod \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " Apr 23 17:58:28.631643 
ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.631554 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-serving-cert\") pod \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " Apr 23 17:58:28.631643 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.631579 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-config\") pod \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\" (UID: \"f90c7c1e-5a74-47b2-b56b-e3ec683385dc\") " Apr 23 17:58:28.631643 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.631607 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f90c7c1e-5a74-47b2-b56b-e3ec683385dc" (UID: "f90c7c1e-5a74-47b2-b56b-e3ec683385dc"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:58:28.631863 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.631809 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-oauth-serving-cert\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:58:28.631863 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.631704 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f90c7c1e-5a74-47b2-b56b-e3ec683385dc" (UID: "f90c7c1e-5a74-47b2-b56b-e3ec683385dc"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:58:28.631941 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.631885 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-service-ca" (OuterVolumeSpecName: "service-ca") pod "f90c7c1e-5a74-47b2-b56b-e3ec683385dc" (UID: "f90c7c1e-5a74-47b2-b56b-e3ec683385dc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:58:28.631989 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.631960 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-config" (OuterVolumeSpecName: "console-config") pod "f90c7c1e-5a74-47b2-b56b-e3ec683385dc" (UID: "f90c7c1e-5a74-47b2-b56b-e3ec683385dc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:58:28.633596 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.633571 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f90c7c1e-5a74-47b2-b56b-e3ec683385dc" (UID: "f90c7c1e-5a74-47b2-b56b-e3ec683385dc"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:58:28.633782 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.633761 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f90c7c1e-5a74-47b2-b56b-e3ec683385dc" (UID: "f90c7c1e-5a74-47b2-b56b-e3ec683385dc"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:58:28.633953 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.633936 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-kube-api-access-xk7s8" (OuterVolumeSpecName: "kube-api-access-xk7s8") pod "f90c7c1e-5a74-47b2-b56b-e3ec683385dc" (UID: "f90c7c1e-5a74-47b2-b56b-e3ec683385dc"). InnerVolumeSpecName "kube-api-access-xk7s8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:58:28.732472 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.732421 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-trusted-ca-bundle\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:58:28.732472 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.732463 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-oauth-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:58:28.732472 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.732475 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-service-ca\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:58:28.732472 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.732484 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xk7s8\" (UniqueName: \"kubernetes.io/projected/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-kube-api-access-xk7s8\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:58:28.732472 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.732494 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-serving-cert\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:58:28.732808 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:28.732503 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f90c7c1e-5a74-47b2-b56b-e3ec683385dc-console-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 17:58:29.316091 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:29.316067 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54567bcbb-7ssnc_f90c7c1e-5a74-47b2-b56b-e3ec683385dc/console/0.log" Apr 23 17:58:29.316270 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:29.316108 2576 generic.go:358] "Generic (PLEG): container finished" podID="f90c7c1e-5a74-47b2-b56b-e3ec683385dc" containerID="769e0589d39a28a6e42c8d5177cd9674f89eba81546d33f38c017e4aad501347" exitCode=2 Apr 23 17:58:29.316270 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:29.316153 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54567bcbb-7ssnc" event={"ID":"f90c7c1e-5a74-47b2-b56b-e3ec683385dc","Type":"ContainerDied","Data":"769e0589d39a28a6e42c8d5177cd9674f89eba81546d33f38c017e4aad501347"} Apr 23 17:58:29.316270 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:29.316188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54567bcbb-7ssnc" event={"ID":"f90c7c1e-5a74-47b2-b56b-e3ec683385dc","Type":"ContainerDied","Data":"64a6aa8c61476cf0c378e1aaf5dda9216d72fe1a95298dce1e5433d8a4f0871c"} Apr 23 17:58:29.316270 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:29.316212 2576 scope.go:117] "RemoveContainer" containerID="769e0589d39a28a6e42c8d5177cd9674f89eba81546d33f38c017e4aad501347" Apr 23 17:58:29.316270 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:29.316215 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54567bcbb-7ssnc" Apr 23 17:58:29.324514 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:29.324386 2576 scope.go:117] "RemoveContainer" containerID="769e0589d39a28a6e42c8d5177cd9674f89eba81546d33f38c017e4aad501347" Apr 23 17:58:29.324785 ip-10-0-132-102 kubenswrapper[2576]: E0423 17:58:29.324678 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"769e0589d39a28a6e42c8d5177cd9674f89eba81546d33f38c017e4aad501347\": container with ID starting with 769e0589d39a28a6e42c8d5177cd9674f89eba81546d33f38c017e4aad501347 not found: ID does not exist" containerID="769e0589d39a28a6e42c8d5177cd9674f89eba81546d33f38c017e4aad501347" Apr 23 17:58:29.324785 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:29.324707 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"769e0589d39a28a6e42c8d5177cd9674f89eba81546d33f38c017e4aad501347"} err="failed to get container status \"769e0589d39a28a6e42c8d5177cd9674f89eba81546d33f38c017e4aad501347\": rpc error: code = NotFound desc = could not find container \"769e0589d39a28a6e42c8d5177cd9674f89eba81546d33f38c017e4aad501347\": container with ID starting with 769e0589d39a28a6e42c8d5177cd9674f89eba81546d33f38c017e4aad501347 not found: ID does not exist" Apr 23 17:58:29.336719 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:29.336698 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54567bcbb-7ssnc"] Apr 23 17:58:29.341149 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:29.341125 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-54567bcbb-7ssnc"] Apr 23 17:58:30.393184 ip-10-0-132-102 kubenswrapper[2576]: I0423 17:58:30.393153 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f90c7c1e-5a74-47b2-b56b-e3ec683385dc" 
path="/var/lib/kubelet/pods/f90c7c1e-5a74-47b2-b56b-e3ec683385dc/volumes" Apr 23 18:02:10.297736 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:10.297712 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6g56n_ae56a92f-dfae-4763-b849-dca72bc2cf3d/kube-multus/0.log" Apr 23 18:02:10.299877 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:10.299857 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6g56n_ae56a92f-dfae-4763-b849-dca72bc2cf3d/kube-multus/0.log" Apr 23 18:02:23.467547 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467518 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"] Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467781 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="kube-rbac-proxy-metric" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467793 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="kube-rbac-proxy-metric" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467807 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="init-config-reloader" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467812 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="init-config-reloader" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467818 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="prom-label-proxy" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467824 
2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="prom-label-proxy" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467830 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="kube-rbac-proxy-web" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467836 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="kube-rbac-proxy-web" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467843 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="alertmanager" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467848 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="alertmanager" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467857 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f90c7c1e-5a74-47b2-b56b-e3ec683385dc" containerName="console" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467862 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f90c7c1e-5a74-47b2-b56b-e3ec683385dc" containerName="console" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467869 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="config-reloader" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467874 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="config-reloader" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467882 2576 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="kube-rbac-proxy" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467887 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="kube-rbac-proxy" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467924 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="alertmanager" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467933 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="config-reloader" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467939 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="kube-rbac-proxy-metric" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467946 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="kube-rbac-proxy-web" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467952 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="kube-rbac-proxy" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467958 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f90c7c1e-5a74-47b2-b56b-e3ec683385dc" containerName="console" Apr 23 18:02:23.469861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.467965 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="392b13b6-7939-4aac-8409-2fcb938f87a3" containerName="prom-label-proxy" Apr 23 18:02:23.470828 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.470814 
2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:23.472618 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.472591 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-3f299-kube-rbac-proxy-sar-config\""
Apr 23 18:02:23.472924 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.472906 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-9mp96\""
Apr 23 18:02:23.473006 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.472911 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 23 18:02:23.473274 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.473257 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-3f299-predictor-serving-cert\""
Apr 23 18:02:23.473328 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.473298 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 23 18:02:23.481003 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.480985 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"]
Apr 23 18:02:23.581761 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.581706 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xvds\" (UniqueName: \"kubernetes.io/projected/555b0261-b09e-424b-a4b8-4c3ab5608adb-kube-api-access-5xvds\") pod \"isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72\" (UID: \"555b0261-b09e-424b-a4b8-4c3ab5608adb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:23.581946 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.581778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-raw-sklearn-batcher-3f299-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/555b0261-b09e-424b-a4b8-4c3ab5608adb-isvc-raw-sklearn-batcher-3f299-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72\" (UID: \"555b0261-b09e-424b-a4b8-4c3ab5608adb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:23.581946 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.581809 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/555b0261-b09e-424b-a4b8-4c3ab5608adb-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72\" (UID: \"555b0261-b09e-424b-a4b8-4c3ab5608adb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:23.581946 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.581845 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/555b0261-b09e-424b-a4b8-4c3ab5608adb-proxy-tls\") pod \"isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72\" (UID: \"555b0261-b09e-424b-a4b8-4c3ab5608adb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:23.682635 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.682603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xvds\" (UniqueName: \"kubernetes.io/projected/555b0261-b09e-424b-a4b8-4c3ab5608adb-kube-api-access-5xvds\") pod \"isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72\" (UID: \"555b0261-b09e-424b-a4b8-4c3ab5608adb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:23.682635 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.682649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-raw-sklearn-batcher-3f299-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/555b0261-b09e-424b-a4b8-4c3ab5608adb-isvc-raw-sklearn-batcher-3f299-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72\" (UID: \"555b0261-b09e-424b-a4b8-4c3ab5608adb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:23.682923 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.682677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/555b0261-b09e-424b-a4b8-4c3ab5608adb-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72\" (UID: \"555b0261-b09e-424b-a4b8-4c3ab5608adb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:23.682923 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.682721 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/555b0261-b09e-424b-a4b8-4c3ab5608adb-proxy-tls\") pod \"isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72\" (UID: \"555b0261-b09e-424b-a4b8-4c3ab5608adb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:23.683138 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.683115 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/555b0261-b09e-424b-a4b8-4c3ab5608adb-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72\" (UID: \"555b0261-b09e-424b-a4b8-4c3ab5608adb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:23.683379 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.683353 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-raw-sklearn-batcher-3f299-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/555b0261-b09e-424b-a4b8-4c3ab5608adb-isvc-raw-sklearn-batcher-3f299-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72\" (UID: \"555b0261-b09e-424b-a4b8-4c3ab5608adb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:23.685360 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.685325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/555b0261-b09e-424b-a4b8-4c3ab5608adb-proxy-tls\") pod \"isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72\" (UID: \"555b0261-b09e-424b-a4b8-4c3ab5608adb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:23.691512 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.691488 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xvds\" (UniqueName: \"kubernetes.io/projected/555b0261-b09e-424b-a4b8-4c3ab5608adb-kube-api-access-5xvds\") pod \"isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72\" (UID: \"555b0261-b09e-424b-a4b8-4c3ab5608adb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:23.781042 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.780941 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:23.897183 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.897161 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"]
Apr 23 18:02:23.899172 ip-10-0-132-102 kubenswrapper[2576]: W0423 18:02:23.899138 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod555b0261_b09e_424b_a4b8_4c3ab5608adb.slice/crio-f6511736e1af2ffa4159758bc6218cbc0db12d9a06a41088e0f5451a8447254b WatchSource:0}: Error finding container f6511736e1af2ffa4159758bc6218cbc0db12d9a06a41088e0f5451a8447254b: Status 404 returned error can't find the container with id f6511736e1af2ffa4159758bc6218cbc0db12d9a06a41088e0f5451a8447254b
Apr 23 18:02:23.900972 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.900957 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 18:02:23.935836 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:23.935805 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" event={"ID":"555b0261-b09e-424b-a4b8-4c3ab5608adb","Type":"ContainerStarted","Data":"f6511736e1af2ffa4159758bc6218cbc0db12d9a06a41088e0f5451a8447254b"}
Apr 23 18:02:27.948950 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:27.948912 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" event={"ID":"555b0261-b09e-424b-a4b8-4c3ab5608adb","Type":"ContainerStarted","Data":"2b140cfeaae01fa501b3a9670d30bd71876fd615e3c828369cd64f361009f155"}
Apr 23 18:02:30.959239 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:30.959161 2576 generic.go:358] "Generic (PLEG): container finished" podID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerID="2b140cfeaae01fa501b3a9670d30bd71876fd615e3c828369cd64f361009f155" exitCode=0
Apr 23 18:02:30.959575 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:30.959233 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" event={"ID":"555b0261-b09e-424b-a4b8-4c3ab5608adb","Type":"ContainerDied","Data":"2b140cfeaae01fa501b3a9670d30bd71876fd615e3c828369cd64f361009f155"}
Apr 23 18:02:44.013488 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:44.013422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" event={"ID":"555b0261-b09e-424b-a4b8-4c3ab5608adb","Type":"ContainerStarted","Data":"abcf7e0370811526fb6c42d24f96f3bd74ab0f4f23809d4fbb2540b1d3d1ef68"}
Apr 23 18:02:46.021276 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:46.021244 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" event={"ID":"555b0261-b09e-424b-a4b8-4c3ab5608adb","Type":"ContainerStarted","Data":"54a608826a2208a2cdfa5d38dd975b1e75d000348572af3ab35b90724e37e734"}
Apr 23 18:02:48.029175 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:48.029090 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" event={"ID":"555b0261-b09e-424b-a4b8-4c3ab5608adb","Type":"ContainerStarted","Data":"ab1847cd644319248b4d3370da60d8780c8e11a4652af5d115a5a4e7baf46083"}
Apr 23 18:02:48.029518 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:48.029270 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:48.049377 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:48.049326 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podStartSLOduration=1.18816684 podStartE2EDuration="25.049313321s" podCreationTimestamp="2026-04-23 18:02:23 +0000 UTC" firstStartedPulling="2026-04-23 18:02:23.901076342 +0000 UTC m=+614.136523022" lastFinishedPulling="2026-04-23 18:02:47.762222819 +0000 UTC m=+637.997669503" observedRunningTime="2026-04-23 18:02:48.047942878 +0000 UTC m=+638.283389584" watchObservedRunningTime="2026-04-23 18:02:48.049313321 +0000 UTC m=+638.284760090"
Apr 23 18:02:49.032575 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:49.032529 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:49.032575 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:49.032576 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:49.034007 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:49.033972 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 23 18:02:49.034622 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:49.034600 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:02:50.035642 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:50.035602 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 23 18:02:50.036119 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:50.036098 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:02:50.039266 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:50.039247 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:02:51.037885 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:51.037844 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 23 18:02:51.038322 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:02:51.038160 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:03:01.038154 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:01.038112 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 23 18:03:01.038634 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:01.038607 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:03:11.038454 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:11.038397 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 23 18:03:11.038868 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:11.038811 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:03:21.038583 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:21.038532 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 23 18:03:21.039069 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:21.039025 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:03:31.037997 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:31.037937 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 23 18:03:31.038393 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:31.038337 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:03:41.038809 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:41.038734 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 23 18:03:41.039289 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:41.039191 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:03:51.038509 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:51.038476 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:03:51.038897 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:51.038579 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"
Apr 23 18:03:58.507398 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.507322 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"]
Apr 23 18:03:58.507871 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.507649 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kserve-container" containerID="cri-o://abcf7e0370811526fb6c42d24f96f3bd74ab0f4f23809d4fbb2540b1d3d1ef68" gracePeriod=30
Apr 23 18:03:58.507871 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.507683 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kube-rbac-proxy" containerID="cri-o://54a608826a2208a2cdfa5d38dd975b1e75d000348572af3ab35b90724e37e734" gracePeriod=30
Apr 23 18:03:58.507871 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.507676 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="agent" containerID="cri-o://ab1847cd644319248b4d3370da60d8780c8e11a4652af5d115a5a4e7baf46083" gracePeriod=30
Apr 23 18:03:58.629192 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.629165 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"]
Apr 23 18:03:58.632456 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.632439 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"
Apr 23 18:03:58.634636 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.634587 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-73769-kube-rbac-proxy-sar-config\""
Apr 23 18:03:58.634636 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.634607 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-73769-predictor-serving-cert\""
Apr 23 18:03:58.644776 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.644724 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"]
Apr 23 18:03:58.707212 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.707184 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"]
Apr 23 18:03:58.710628 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.710611 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"
Apr 23 18:03:58.712771 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.712735 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-73769-kube-rbac-proxy-sar-config\""
Apr 23 18:03:58.712878 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.712732 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-73769-predictor-serving-cert\""
Apr 23 18:03:58.720095 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.720073 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"]
Apr 23 18:03:58.741687 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.741649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9\" (UID: \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"
Apr 23 18:03:58.741805 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.741715 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6fvh\" (UniqueName: \"kubernetes.io/projected/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-kube-api-access-d6fvh\") pod \"isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9\" (UID: \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"
Apr 23 18:03:58.741805 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.741760 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-proxy-tls\") pod \"isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9\" (UID: \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"
Apr 23 18:03:58.741805 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.741799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-73769-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-isvc-sklearn-graph-raw-73769-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9\" (UID: \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"
Apr 23 18:03:58.842413 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.842325 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9\" (UID: \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"
Apr 23 18:03:58.842413 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.842366 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe86fd31-2c47-4aac-869b-15a01eea4604-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"
Apr 23 18:03:58.842413 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.842390 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6fvh\" (UniqueName: \"kubernetes.io/projected/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-kube-api-access-d6fvh\") pod \"isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9\" (UID: \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"
Apr 23 18:03:58.842683 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.842439 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-proxy-tls\") pod \"isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9\" (UID: \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"
Apr 23 18:03:58.842683 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.842482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-73769-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fe86fd31-2c47-4aac-869b-15a01eea4604-isvc-xgboost-graph-raw-73769-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"
Apr 23 18:03:58.842683 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.842532 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-73769-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-isvc-sklearn-graph-raw-73769-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9\" (UID: \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"
Apr 23 18:03:58.842683 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.842567 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n2v5\" (UniqueName: \"kubernetes.io/projected/fe86fd31-2c47-4aac-869b-15a01eea4604-kube-api-access-4n2v5\") pod \"isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"
Apr 23 18:03:58.842683 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.842617 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe86fd31-2c47-4aac-869b-15a01eea4604-proxy-tls\") pod \"isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"
Apr 23 18:03:58.842973 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.842708 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9\" (UID: \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"
Apr 23 18:03:58.843126 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.843108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-73769-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-isvc-sklearn-graph-raw-73769-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9\" (UID: \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"
Apr 23 18:03:58.845022 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.844999 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-proxy-tls\") pod \"isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9\" (UID: \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"
Apr 23 18:03:58.851603 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.851577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6fvh\" (UniqueName: \"kubernetes.io/projected/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-kube-api-access-d6fvh\") pod \"isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9\" (UID: \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"
Apr 23 18:03:58.943077 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.943031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe86fd31-2c47-4aac-869b-15a01eea4604-proxy-tls\") pod \"isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"
Apr 23 18:03:58.943278 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.943096 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe86fd31-2c47-4aac-869b-15a01eea4604-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"
Apr 23 18:03:58.943278 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.943144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-73769-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fe86fd31-2c47-4aac-869b-15a01eea4604-isvc-xgboost-graph-raw-73769-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"
Apr 23 18:03:58.943278 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.943187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4n2v5\" (UniqueName: \"kubernetes.io/projected/fe86fd31-2c47-4aac-869b-15a01eea4604-kube-api-access-4n2v5\") pod \"isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"
Apr 23 18:03:58.943278 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:03:58.943190 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-serving-cert: secret "isvc-xgboost-graph-raw-73769-predictor-serving-cert" not found
Apr 23 18:03:58.943278 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:03:58.943256 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe86fd31-2c47-4aac-869b-15a01eea4604-proxy-tls podName:fe86fd31-2c47-4aac-869b-15a01eea4604 nodeName:}" failed. No retries permitted until 2026-04-23 18:03:59.443235151 +0000 UTC m=+709.678681842 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/fe86fd31-2c47-4aac-869b-15a01eea4604-proxy-tls") pod "isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" (UID: "fe86fd31-2c47-4aac-869b-15a01eea4604") : secret "isvc-xgboost-graph-raw-73769-predictor-serving-cert" not found
Apr 23 18:03:58.943607 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.943586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe86fd31-2c47-4aac-869b-15a01eea4604-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"
Apr 23 18:03:58.943863 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.943845 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-73769-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fe86fd31-2c47-4aac-869b-15a01eea4604-isvc-xgboost-graph-raw-73769-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"
Apr 23 18:03:58.944471 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.944445 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"
Apr 23 18:03:58.953229 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:58.953208 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n2v5\" (UniqueName: \"kubernetes.io/projected/fe86fd31-2c47-4aac-869b-15a01eea4604-kube-api-access-4n2v5\") pod \"isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"
Apr 23 18:03:59.065696 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:59.065668 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"]
Apr 23 18:03:59.067478 ip-10-0-132-102 kubenswrapper[2576]: W0423 18:03:59.067453 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod972b4d2b_3e63_4c82_b97c_cd3ac27b7bbe.slice/crio-41752776ed12ba832b310eb025230404c52fcc68b8797f62e1e24d9b2274eec7 WatchSource:0}: Error finding container 41752776ed12ba832b310eb025230404c52fcc68b8797f62e1e24d9b2274eec7: Status 404 returned error can't find the container with id 41752776ed12ba832b310eb025230404c52fcc68b8797f62e1e24d9b2274eec7
Apr 23 18:03:59.230277 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:59.230247 2576 generic.go:358] "Generic (PLEG): container finished" podID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerID="54a608826a2208a2cdfa5d38dd975b1e75d000348572af3ab35b90724e37e734" exitCode=2
Apr 23 18:03:59.230469 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:59.230313 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" event={"ID":"555b0261-b09e-424b-a4b8-4c3ab5608adb","Type":"ContainerDied","Data":"54a608826a2208a2cdfa5d38dd975b1e75d000348572af3ab35b90724e37e734"}
Apr 23 18:03:59.231587 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:59.231559 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" event={"ID":"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe","Type":"ContainerStarted","Data":"cfefd387c3a20a9e9e24df5ef00b7ae1f32cf5138a49a1730fe62d9909908d21"}
Apr 23 18:03:59.231683 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:59.231597 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" event={"ID":"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe","Type":"ContainerStarted","Data":"41752776ed12ba832b310eb025230404c52fcc68b8797f62e1e24d9b2274eec7"}
Apr 23 18:03:59.447805 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:59.447769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe86fd31-2c47-4aac-869b-15a01eea4604-proxy-tls\") pod \"isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"
Apr 23 18:03:59.450081 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:59.450064 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe86fd31-2c47-4aac-869b-15a01eea4604-proxy-tls\") pod \"isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"
Apr 23 18:03:59.620501 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:59.620451 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"
Apr 23 18:03:59.734953 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:03:59.734931 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"]
Apr 23 18:03:59.737190 ip-10-0-132-102 kubenswrapper[2576]: W0423 18:03:59.737164 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe86fd31_2c47_4aac_869b_15a01eea4604.slice/crio-c0f5ed25b41bb2116632542fa89bc99c5c653f97f0436da0289c24a97e0d05d0 WatchSource:0}: Error finding container c0f5ed25b41bb2116632542fa89bc99c5c653f97f0436da0289c24a97e0d05d0: Status 404 returned error can't find the container with id c0f5ed25b41bb2116632542fa89bc99c5c653f97f0436da0289c24a97e0d05d0
Apr 23 18:04:00.036552 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:00.036454 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.20:8643/healthz\": dial tcp 10.133.0.20:8643: connect: connection refused"
Apr 23 18:04:00.236232 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:00.236195 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" event={"ID":"fe86fd31-2c47-4aac-869b-15a01eea4604","Type":"ContainerStarted","Data":"bdf40818b4c5e670ce79a2f09bb7a49a6c582c6e8d2f94b2fd1e3e2a95061123"}
Apr 23 18:04:00.236232 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:00.236236 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"
event={"ID":"fe86fd31-2c47-4aac-869b-15a01eea4604","Type":"ContainerStarted","Data":"c0f5ed25b41bb2116632542fa89bc99c5c653f97f0436da0289c24a97e0d05d0"} Apr 23 18:04:01.038368 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:01.038328 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 23 18:04:01.038766 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:01.038689 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:04:03.246649 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:03.246617 2576 generic.go:358] "Generic (PLEG): container finished" podID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerID="abcf7e0370811526fb6c42d24f96f3bd74ab0f4f23809d4fbb2540b1d3d1ef68" exitCode=0 Apr 23 18:04:03.247067 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:03.246677 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" event={"ID":"555b0261-b09e-424b-a4b8-4c3ab5608adb","Type":"ContainerDied","Data":"abcf7e0370811526fb6c42d24f96f3bd74ab0f4f23809d4fbb2540b1d3d1ef68"} Apr 23 18:04:03.247874 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:03.247852 2576 generic.go:358] "Generic (PLEG): container finished" podID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerID="cfefd387c3a20a9e9e24df5ef00b7ae1f32cf5138a49a1730fe62d9909908d21" exitCode=0 Apr 23 18:04:03.247924 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:03.247896 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" event={"ID":"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe","Type":"ContainerDied","Data":"cfefd387c3a20a9e9e24df5ef00b7ae1f32cf5138a49a1730fe62d9909908d21"} Apr 23 18:04:04.255287 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:04.255251 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" event={"ID":"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe","Type":"ContainerStarted","Data":"1e724f2bf3b1bd98e1c175eda0dcec76cdc89c01f8f90f0f8852d53b32d9a9a6"} Apr 23 18:04:04.255815 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:04.255298 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" event={"ID":"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe","Type":"ContainerStarted","Data":"38cbf88c9c347b9eeb42da50bf383035fc4436c9a1442aad9be73b52e8031c03"} Apr 23 18:04:04.255815 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:04.255626 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" Apr 23 18:04:04.255815 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:04.255649 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" Apr 23 18:04:04.256642 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:04.256622 2576 generic.go:358] "Generic (PLEG): container finished" podID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerID="bdf40818b4c5e670ce79a2f09bb7a49a6c582c6e8d2f94b2fd1e3e2a95061123" exitCode=0 Apr 23 18:04:04.256731 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:04.256660 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" 
event={"ID":"fe86fd31-2c47-4aac-869b-15a01eea4604","Type":"ContainerDied","Data":"bdf40818b4c5e670ce79a2f09bb7a49a6c582c6e8d2f94b2fd1e3e2a95061123"} Apr 23 18:04:04.257166 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:04.257130 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:04:04.273362 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:04.273321 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" podStartSLOduration=6.27330975 podStartE2EDuration="6.27330975s" podCreationTimestamp="2026-04-23 18:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:04:04.272584331 +0000 UTC m=+714.508031035" watchObservedRunningTime="2026-04-23 18:04:04.27330975 +0000 UTC m=+714.508756453" Apr 23 18:04:05.036463 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:05.036030 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.20:8643/healthz\": dial tcp 10.133.0.20:8643: connect: connection refused" Apr 23 18:04:05.260445 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:05.260404 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:04:10.036029 
ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:10.035980 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.20:8643/healthz\": dial tcp 10.133.0.20:8643: connect: connection refused" Apr 23 18:04:10.036503 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:10.036178 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" Apr 23 18:04:10.265439 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:10.265405 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" Apr 23 18:04:10.266340 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:10.266302 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:04:11.038100 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:11.038053 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 23 18:04:11.038558 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:11.038413 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="agent" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Apr 23 18:04:15.035938 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:15.035891 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.20:8643/healthz\": dial tcp 10.133.0.20:8643: connect: connection refused" Apr 23 18:04:20.036854 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:20.036793 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.20:8643/healthz\": dial tcp 10.133.0.20:8643: connect: connection refused" Apr 23 18:04:20.266256 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:20.266218 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:04:21.037894 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:21.037851 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 23 18:04:21.038338 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:21.038002 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" Apr 23 18:04:21.038338 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:21.038183 2576 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:04:21.038338 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:21.038263 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" Apr 23 18:04:21.312348 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:21.312260 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" event={"ID":"fe86fd31-2c47-4aac-869b-15a01eea4604","Type":"ContainerStarted","Data":"562f1d0c95ba1c5756e935602c3122c2854d63f30536ce3944aa72e36504194b"} Apr 23 18:04:21.312348 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:21.312300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" event={"ID":"fe86fd31-2c47-4aac-869b-15a01eea4604","Type":"ContainerStarted","Data":"998ecab27d60555639985c5c7f75c0a058947d04b1831ef4f8ef0b15d8fd947f"} Apr 23 18:04:21.312584 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:21.312519 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" Apr 23 18:04:21.332732 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:21.332681 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" podStartSLOduration=7.059861824 podStartE2EDuration="23.3326658s" podCreationTimestamp="2026-04-23 18:03:58 +0000 UTC" firstStartedPulling="2026-04-23 18:04:04.25797203 +0000 UTC m=+714.493418714" lastFinishedPulling="2026-04-23 18:04:20.530776006 +0000 UTC m=+730.766222690" 
observedRunningTime="2026-04-23 18:04:21.331886304 +0000 UTC m=+731.567333007" watchObservedRunningTime="2026-04-23 18:04:21.3326658 +0000 UTC m=+731.568112503" Apr 23 18:04:22.316446 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:22.316422 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" Apr 23 18:04:22.317754 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:22.317713 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 23 18:04:23.319097 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:23.319051 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 23 18:04:25.036124 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:25.036082 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.20:8643/healthz\": dial tcp 10.133.0.20:8643: connect: connection refused" Apr 23 18:04:28.324151 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:28.324125 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" Apr 23 18:04:28.324575 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:28.324552 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 23 18:04:28.646080 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:28.646051 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" Apr 23 18:04:28.670264 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:28.670236 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-raw-sklearn-batcher-3f299-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/555b0261-b09e-424b-a4b8-4c3ab5608adb-isvc-raw-sklearn-batcher-3f299-kube-rbac-proxy-sar-config\") pod \"555b0261-b09e-424b-a4b8-4c3ab5608adb\" (UID: \"555b0261-b09e-424b-a4b8-4c3ab5608adb\") " Apr 23 18:04:28.670411 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:28.670280 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/555b0261-b09e-424b-a4b8-4c3ab5608adb-kserve-provision-location\") pod \"555b0261-b09e-424b-a4b8-4c3ab5608adb\" (UID: \"555b0261-b09e-424b-a4b8-4c3ab5608adb\") " Apr 23 18:04:28.670411 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:28.670298 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xvds\" (UniqueName: \"kubernetes.io/projected/555b0261-b09e-424b-a4b8-4c3ab5608adb-kube-api-access-5xvds\") pod \"555b0261-b09e-424b-a4b8-4c3ab5608adb\" (UID: \"555b0261-b09e-424b-a4b8-4c3ab5608adb\") " Apr 23 18:04:28.670502 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:28.670414 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/555b0261-b09e-424b-a4b8-4c3ab5608adb-proxy-tls\") 
pod \"555b0261-b09e-424b-a4b8-4c3ab5608adb\" (UID: \"555b0261-b09e-424b-a4b8-4c3ab5608adb\") " Apr 23 18:04:28.670658 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:28.670632 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/555b0261-b09e-424b-a4b8-4c3ab5608adb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "555b0261-b09e-424b-a4b8-4c3ab5608adb" (UID: "555b0261-b09e-424b-a4b8-4c3ab5608adb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:04:28.670727 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:28.670714 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/555b0261-b09e-424b-a4b8-4c3ab5608adb-kserve-provision-location\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:04:28.670803 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:28.670642 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/555b0261-b09e-424b-a4b8-4c3ab5608adb-isvc-raw-sklearn-batcher-3f299-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-raw-sklearn-batcher-3f299-kube-rbac-proxy-sar-config") pod "555b0261-b09e-424b-a4b8-4c3ab5608adb" (UID: "555b0261-b09e-424b-a4b8-4c3ab5608adb"). InnerVolumeSpecName "isvc-raw-sklearn-batcher-3f299-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:04:28.672468 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:28.672446 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555b0261-b09e-424b-a4b8-4c3ab5608adb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "555b0261-b09e-424b-a4b8-4c3ab5608adb" (UID: "555b0261-b09e-424b-a4b8-4c3ab5608adb"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:04:28.672545 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:28.672521 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/555b0261-b09e-424b-a4b8-4c3ab5608adb-kube-api-access-5xvds" (OuterVolumeSpecName: "kube-api-access-5xvds") pod "555b0261-b09e-424b-a4b8-4c3ab5608adb" (UID: "555b0261-b09e-424b-a4b8-4c3ab5608adb"). InnerVolumeSpecName "kube-api-access-5xvds". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:04:28.771060 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:28.771021 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/555b0261-b09e-424b-a4b8-4c3ab5608adb-proxy-tls\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:04:28.771060 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:28.771050 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-raw-sklearn-batcher-3f299-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/555b0261-b09e-424b-a4b8-4c3ab5608adb-isvc-raw-sklearn-batcher-3f299-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:04:28.771060 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:28.771063 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5xvds\" (UniqueName: \"kubernetes.io/projected/555b0261-b09e-424b-a4b8-4c3ab5608adb-kube-api-access-5xvds\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:04:29.336853 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.336819 2576 generic.go:358] "Generic (PLEG): container finished" podID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerID="ab1847cd644319248b4d3370da60d8780c8e11a4652af5d115a5a4e7baf46083" exitCode=0 Apr 23 18:04:29.337268 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.336898 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" event={"ID":"555b0261-b09e-424b-a4b8-4c3ab5608adb","Type":"ContainerDied","Data":"ab1847cd644319248b4d3370da60d8780c8e11a4652af5d115a5a4e7baf46083"} Apr 23 18:04:29.337268 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.336912 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" Apr 23 18:04:29.337268 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.336924 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72" event={"ID":"555b0261-b09e-424b-a4b8-4c3ab5608adb","Type":"ContainerDied","Data":"f6511736e1af2ffa4159758bc6218cbc0db12d9a06a41088e0f5451a8447254b"} Apr 23 18:04:29.337268 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.336940 2576 scope.go:117] "RemoveContainer" containerID="ab1847cd644319248b4d3370da60d8780c8e11a4652af5d115a5a4e7baf46083" Apr 23 18:04:29.344433 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.344416 2576 scope.go:117] "RemoveContainer" containerID="54a608826a2208a2cdfa5d38dd975b1e75d000348572af3ab35b90724e37e734" Apr 23 18:04:29.351318 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.351300 2576 scope.go:117] "RemoveContainer" containerID="abcf7e0370811526fb6c42d24f96f3bd74ab0f4f23809d4fbb2540b1d3d1ef68" Apr 23 18:04:29.358535 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.358517 2576 scope.go:117] "RemoveContainer" containerID="2b140cfeaae01fa501b3a9670d30bd71876fd615e3c828369cd64f361009f155" Apr 23 18:04:29.359376 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.359352 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"] Apr 23 18:04:29.360917 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.360891 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3f299-predictor-5d54bcdf87-chc72"] Apr 23 18:04:29.365713 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.365695 2576 scope.go:117] "RemoveContainer" containerID="ab1847cd644319248b4d3370da60d8780c8e11a4652af5d115a5a4e7baf46083" Apr 23 18:04:29.366002 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:04:29.365981 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab1847cd644319248b4d3370da60d8780c8e11a4652af5d115a5a4e7baf46083\": container with ID starting with ab1847cd644319248b4d3370da60d8780c8e11a4652af5d115a5a4e7baf46083 not found: ID does not exist" containerID="ab1847cd644319248b4d3370da60d8780c8e11a4652af5d115a5a4e7baf46083" Apr 23 18:04:29.366049 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.366010 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab1847cd644319248b4d3370da60d8780c8e11a4652af5d115a5a4e7baf46083"} err="failed to get container status \"ab1847cd644319248b4d3370da60d8780c8e11a4652af5d115a5a4e7baf46083\": rpc error: code = NotFound desc = could not find container \"ab1847cd644319248b4d3370da60d8780c8e11a4652af5d115a5a4e7baf46083\": container with ID starting with ab1847cd644319248b4d3370da60d8780c8e11a4652af5d115a5a4e7baf46083 not found: ID does not exist" Apr 23 18:04:29.366049 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.366028 2576 scope.go:117] "RemoveContainer" containerID="54a608826a2208a2cdfa5d38dd975b1e75d000348572af3ab35b90724e37e734" Apr 23 18:04:29.366274 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:04:29.366258 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54a608826a2208a2cdfa5d38dd975b1e75d000348572af3ab35b90724e37e734\": container with ID starting with 54a608826a2208a2cdfa5d38dd975b1e75d000348572af3ab35b90724e37e734 not found: ID does not exist" 
containerID="54a608826a2208a2cdfa5d38dd975b1e75d000348572af3ab35b90724e37e734" Apr 23 18:04:29.366321 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.366278 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54a608826a2208a2cdfa5d38dd975b1e75d000348572af3ab35b90724e37e734"} err="failed to get container status \"54a608826a2208a2cdfa5d38dd975b1e75d000348572af3ab35b90724e37e734\": rpc error: code = NotFound desc = could not find container \"54a608826a2208a2cdfa5d38dd975b1e75d000348572af3ab35b90724e37e734\": container with ID starting with 54a608826a2208a2cdfa5d38dd975b1e75d000348572af3ab35b90724e37e734 not found: ID does not exist" Apr 23 18:04:29.366321 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.366290 2576 scope.go:117] "RemoveContainer" containerID="abcf7e0370811526fb6c42d24f96f3bd74ab0f4f23809d4fbb2540b1d3d1ef68" Apr 23 18:04:29.366503 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:04:29.366488 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abcf7e0370811526fb6c42d24f96f3bd74ab0f4f23809d4fbb2540b1d3d1ef68\": container with ID starting with abcf7e0370811526fb6c42d24f96f3bd74ab0f4f23809d4fbb2540b1d3d1ef68 not found: ID does not exist" containerID="abcf7e0370811526fb6c42d24f96f3bd74ab0f4f23809d4fbb2540b1d3d1ef68" Apr 23 18:04:29.366543 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.366507 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abcf7e0370811526fb6c42d24f96f3bd74ab0f4f23809d4fbb2540b1d3d1ef68"} err="failed to get container status \"abcf7e0370811526fb6c42d24f96f3bd74ab0f4f23809d4fbb2540b1d3d1ef68\": rpc error: code = NotFound desc = could not find container \"abcf7e0370811526fb6c42d24f96f3bd74ab0f4f23809d4fbb2540b1d3d1ef68\": container with ID starting with abcf7e0370811526fb6c42d24f96f3bd74ab0f4f23809d4fbb2540b1d3d1ef68 not found: ID does not exist" Apr 23 
18:04:29.366543 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.366518 2576 scope.go:117] "RemoveContainer" containerID="2b140cfeaae01fa501b3a9670d30bd71876fd615e3c828369cd64f361009f155" Apr 23 18:04:29.366715 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:04:29.366699 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b140cfeaae01fa501b3a9670d30bd71876fd615e3c828369cd64f361009f155\": container with ID starting with 2b140cfeaae01fa501b3a9670d30bd71876fd615e3c828369cd64f361009f155 not found: ID does not exist" containerID="2b140cfeaae01fa501b3a9670d30bd71876fd615e3c828369cd64f361009f155" Apr 23 18:04:29.366772 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:29.366719 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b140cfeaae01fa501b3a9670d30bd71876fd615e3c828369cd64f361009f155"} err="failed to get container status \"2b140cfeaae01fa501b3a9670d30bd71876fd615e3c828369cd64f361009f155\": rpc error: code = NotFound desc = could not find container \"2b140cfeaae01fa501b3a9670d30bd71876fd615e3c828369cd64f361009f155\": container with ID starting with 2b140cfeaae01fa501b3a9670d30bd71876fd615e3c828369cd64f361009f155 not found: ID does not exist" Apr 23 18:04:30.268976 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:30.268896 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:04:30.393035 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:30.392999 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" path="/var/lib/kubelet/pods/555b0261-b09e-424b-a4b8-4c3ab5608adb/volumes" Apr 23 18:04:38.325246 ip-10-0-132-102 
kubenswrapper[2576]: I0423 18:04:38.325200 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 23 18:04:40.266989 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:40.266949 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:04:48.325340 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:48.325302 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 23 18:04:50.267029 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:50.266992 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:04:58.325366 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:04:58.325325 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 23 18:05:00.267039 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:00.267003 2576 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:05:08.325457 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:08.325413 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 23 18:05:10.267130 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:10.267101 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" Apr 23 18:05:18.325336 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:18.325290 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 23 18:05:28.325924 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:28.325832 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" Apr 23 18:05:38.856137 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.856107 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"] Apr 23 18:05:38.856564 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.856448 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" 
podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kserve-container" containerID="cri-o://38cbf88c9c347b9eeb42da50bf383035fc4436c9a1442aad9be73b52e8031c03" gracePeriod=30 Apr 23 18:05:38.856564 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.856507 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kube-rbac-proxy" containerID="cri-o://1e724f2bf3b1bd98e1c175eda0dcec76cdc89c01f8f90f0f8852d53b32d9a9a6" gracePeriod=30 Apr 23 18:05:38.904698 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.904667 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt"] Apr 23 18:05:38.904991 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.904975 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kube-rbac-proxy" Apr 23 18:05:38.904991 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.904991 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kube-rbac-proxy" Apr 23 18:05:38.905135 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.905015 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kserve-container" Apr 23 18:05:38.905135 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.905021 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kserve-container" Apr 23 18:05:38.905135 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.905028 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="agent" Apr 23 18:05:38.905135 ip-10-0-132-102 
kubenswrapper[2576]: I0423 18:05:38.905034 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="agent" Apr 23 18:05:38.905135 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.905049 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="storage-initializer" Apr 23 18:05:38.905135 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.905055 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="storage-initializer" Apr 23 18:05:38.905135 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.905106 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kube-rbac-proxy" Apr 23 18:05:38.905135 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.905113 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="kserve-container" Apr 23 18:05:38.905135 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.905123 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="555b0261-b09e-424b-a4b8-4c3ab5608adb" containerName="agent" Apr 23 18:05:38.907164 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.907144 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:38.910138 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.910112 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\"" Apr 23 18:05:38.910248 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.910231 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-050b4-predictor-serving-cert\"" Apr 23 18:05:38.924432 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:38.924413 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt"] Apr 23 18:05:39.009130 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.009102 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38a8f284-3466-4635-8eed-34e77cafee06-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:39.009259 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.009139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38a8f284-3466-4635-8eed-34e77cafee06-isvc-sklearn-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:39.009259 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.009170 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2scfw\" (UniqueName: \"kubernetes.io/projected/38a8f284-3466-4635-8eed-34e77cafee06-kube-api-access-2scfw\") pod \"isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:39.009345 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.009277 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38a8f284-3466-4635-8eed-34e77cafee06-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:39.059869 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.059839 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"] Apr 23 18:05:39.060140 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.060113 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="kserve-container" containerID="cri-o://998ecab27d60555639985c5c7f75c0a058947d04b1831ef4f8ef0b15d8fd947f" gracePeriod=30 Apr 23 18:05:39.060221 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.060156 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="kube-rbac-proxy" containerID="cri-o://562f1d0c95ba1c5756e935602c3122c2854d63f30536ce3944aa72e36504194b" 
gracePeriod=30 Apr 23 18:05:39.070376 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.070351 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x"] Apr 23 18:05:39.072677 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.072660 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:05:39.075016 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.074988 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-050b4-predictor-serving-cert\"" Apr 23 18:05:39.075107 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.075065 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\"" Apr 23 18:05:39.083302 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.083283 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x"] Apr 23 18:05:39.110105 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.110031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38a8f284-3466-4635-8eed-34e77cafee06-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:39.110105 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.110066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/38a8f284-3466-4635-8eed-34e77cafee06-isvc-sklearn-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:39.110293 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.110105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2scfw\" (UniqueName: \"kubernetes.io/projected/38a8f284-3466-4635-8eed-34e77cafee06-kube-api-access-2scfw\") pod \"isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:39.110293 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.110143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38a8f284-3466-4635-8eed-34e77cafee06-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:39.110293 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:05:39.110182 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-serving-cert: secret "isvc-sklearn-graph-raw-hpa-050b4-predictor-serving-cert" not found Apr 23 18:05:39.110293 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:05:39.110261 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38a8f284-3466-4635-8eed-34e77cafee06-proxy-tls podName:38a8f284-3466-4635-8eed-34e77cafee06 nodeName:}" failed. No retries permitted until 2026-04-23 18:05:39.610240056 +0000 UTC m=+809.845686753 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/38a8f284-3466-4635-8eed-34e77cafee06-proxy-tls") pod "isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" (UID: "38a8f284-3466-4635-8eed-34e77cafee06") : secret "isvc-sklearn-graph-raw-hpa-050b4-predictor-serving-cert" not found Apr 23 18:05:39.110623 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.110605 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38a8f284-3466-4635-8eed-34e77cafee06-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:39.110771 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.110736 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38a8f284-3466-4635-8eed-34e77cafee06-isvc-sklearn-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:39.118152 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.118123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2scfw\" (UniqueName: \"kubernetes.io/projected/38a8f284-3466-4635-8eed-34e77cafee06-kube-api-access-2scfw\") pod \"isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:39.210871 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.210841 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12caa729-dcf3-4776-aa94-964438c08d3e-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x\" (UID: \"12caa729-dcf3-4776-aa94-964438c08d3e\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:05:39.210970 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.210878 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12caa729-dcf3-4776-aa94-964438c08d3e-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x\" (UID: \"12caa729-dcf3-4776-aa94-964438c08d3e\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:05:39.210970 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.210920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7x77\" (UniqueName: \"kubernetes.io/projected/12caa729-dcf3-4776-aa94-964438c08d3e-kube-api-access-c7x77\") pod \"isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x\" (UID: \"12caa729-dcf3-4776-aa94-964438c08d3e\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:05:39.210970 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.210941 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/12caa729-dcf3-4776-aa94-964438c08d3e-isvc-xgboost-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x\" (UID: \"12caa729-dcf3-4776-aa94-964438c08d3e\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 
18:05:39.312291 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.312259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12caa729-dcf3-4776-aa94-964438c08d3e-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x\" (UID: \"12caa729-dcf3-4776-aa94-964438c08d3e\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:05:39.312291 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.312295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12caa729-dcf3-4776-aa94-964438c08d3e-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x\" (UID: \"12caa729-dcf3-4776-aa94-964438c08d3e\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:05:39.312495 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.312331 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7x77\" (UniqueName: \"kubernetes.io/projected/12caa729-dcf3-4776-aa94-964438c08d3e-kube-api-access-c7x77\") pod \"isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x\" (UID: \"12caa729-dcf3-4776-aa94-964438c08d3e\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:05:39.312495 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.312357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/12caa729-dcf3-4776-aa94-964438c08d3e-isvc-xgboost-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x\" (UID: \"12caa729-dcf3-4776-aa94-964438c08d3e\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:05:39.312759 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.312713 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12caa729-dcf3-4776-aa94-964438c08d3e-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x\" (UID: \"12caa729-dcf3-4776-aa94-964438c08d3e\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:05:39.313070 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.313049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/12caa729-dcf3-4776-aa94-964438c08d3e-isvc-xgboost-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x\" (UID: \"12caa729-dcf3-4776-aa94-964438c08d3e\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:05:39.314836 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.314810 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12caa729-dcf3-4776-aa94-964438c08d3e-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x\" (UID: \"12caa729-dcf3-4776-aa94-964438c08d3e\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:05:39.320708 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.320683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7x77\" (UniqueName: \"kubernetes.io/projected/12caa729-dcf3-4776-aa94-964438c08d3e-kube-api-access-c7x77\") pod \"isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x\" (UID: \"12caa729-dcf3-4776-aa94-964438c08d3e\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:05:39.382782 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.382753 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:05:39.502000 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.501976 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x"] Apr 23 18:05:39.504106 ip-10-0-132-102 kubenswrapper[2576]: W0423 18:05:39.504081 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12caa729_dcf3_4776_aa94_964438c08d3e.slice/crio-b7d28ef1d3ab92c66f8f8892f2675390410620341e09e593b3bc9f67888855b9 WatchSource:0}: Error finding container b7d28ef1d3ab92c66f8f8892f2675390410620341e09e593b3bc9f67888855b9: Status 404 returned error can't find the container with id b7d28ef1d3ab92c66f8f8892f2675390410620341e09e593b3bc9f67888855b9 Apr 23 18:05:39.525531 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.525498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" event={"ID":"12caa729-dcf3-4776-aa94-964438c08d3e","Type":"ContainerStarted","Data":"b7d28ef1d3ab92c66f8f8892f2675390410620341e09e593b3bc9f67888855b9"} Apr 23 18:05:39.527261 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.527239 2576 generic.go:358] "Generic (PLEG): container finished" podID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerID="1e724f2bf3b1bd98e1c175eda0dcec76cdc89c01f8f90f0f8852d53b32d9a9a6" exitCode=2 Apr 23 18:05:39.527343 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.527308 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" 
event={"ID":"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe","Type":"ContainerDied","Data":"1e724f2bf3b1bd98e1c175eda0dcec76cdc89c01f8f90f0f8852d53b32d9a9a6"} Apr 23 18:05:39.529069 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.529051 2576 generic.go:358] "Generic (PLEG): container finished" podID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerID="562f1d0c95ba1c5756e935602c3122c2854d63f30536ce3944aa72e36504194b" exitCode=2 Apr 23 18:05:39.529133 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.529106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" event={"ID":"fe86fd31-2c47-4aac-869b-15a01eea4604","Type":"ContainerDied","Data":"562f1d0c95ba1c5756e935602c3122c2854d63f30536ce3944aa72e36504194b"} Apr 23 18:05:39.614292 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.614256 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38a8f284-3466-4635-8eed-34e77cafee06-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:39.616858 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.616820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38a8f284-3466-4635-8eed-34e77cafee06-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:39.819187 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.819095 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:39.959778 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:39.959416 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt"] Apr 23 18:05:39.962821 ip-10-0-132-102 kubenswrapper[2576]: W0423 18:05:39.962788 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38a8f284_3466_4635_8eed_34e77cafee06.slice/crio-02e7f81d06992d7fe5df60b777d3c5b98e482633dbf7b9a5ae867f2dac6e41aa WatchSource:0}: Error finding container 02e7f81d06992d7fe5df60b777d3c5b98e482633dbf7b9a5ae867f2dac6e41aa: Status 404 returned error can't find the container with id 02e7f81d06992d7fe5df60b777d3c5b98e482633dbf7b9a5ae867f2dac6e41aa Apr 23 18:05:40.260907 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:40.260866 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": dial tcp 10.133.0.21:8643: connect: connection refused" Apr 23 18:05:40.266861 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:40.266835 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 23 18:05:40.533392 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:40.533300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" 
event={"ID":"12caa729-dcf3-4776-aa94-964438c08d3e","Type":"ContainerStarted","Data":"7f29cad094197d4d7fdeb8e2e80bd9ad80dcc3de40bc4c96e9a27bab328b45b3"} Apr 23 18:05:40.534584 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:40.534557 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" event={"ID":"38a8f284-3466-4635-8eed-34e77cafee06","Type":"ContainerStarted","Data":"871b3727dd105c41e141f7fdc20a9c1def58bd5791d881a2fbc7019bb341ae7f"} Apr 23 18:05:40.534718 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:40.534590 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" event={"ID":"38a8f284-3466-4635-8eed-34e77cafee06","Type":"ContainerStarted","Data":"02e7f81d06992d7fe5df60b777d3c5b98e482633dbf7b9a5ae867f2dac6e41aa"} Apr 23 18:05:42.906919 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:42.906893 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" Apr 23 18:05:43.041639 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.041544 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe86fd31-2c47-4aac-869b-15a01eea4604-proxy-tls\") pod \"fe86fd31-2c47-4aac-869b-15a01eea4604\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " Apr 23 18:05:43.041639 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.041625 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe86fd31-2c47-4aac-869b-15a01eea4604-kserve-provision-location\") pod \"fe86fd31-2c47-4aac-869b-15a01eea4604\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " Apr 23 18:05:43.041870 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.041654 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n2v5\" (UniqueName: \"kubernetes.io/projected/fe86fd31-2c47-4aac-869b-15a01eea4604-kube-api-access-4n2v5\") pod \"fe86fd31-2c47-4aac-869b-15a01eea4604\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " Apr 23 18:05:43.041870 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.041680 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-73769-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fe86fd31-2c47-4aac-869b-15a01eea4604-isvc-xgboost-graph-raw-73769-kube-rbac-proxy-sar-config\") pod \"fe86fd31-2c47-4aac-869b-15a01eea4604\" (UID: \"fe86fd31-2c47-4aac-869b-15a01eea4604\") " Apr 23 18:05:43.042022 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.042001 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe86fd31-2c47-4aac-869b-15a01eea4604-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "fe86fd31-2c47-4aac-869b-15a01eea4604" (UID: "fe86fd31-2c47-4aac-869b-15a01eea4604"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:05:43.042132 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.042106 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe86fd31-2c47-4aac-869b-15a01eea4604-isvc-xgboost-graph-raw-73769-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-73769-kube-rbac-proxy-sar-config") pod "fe86fd31-2c47-4aac-869b-15a01eea4604" (UID: "fe86fd31-2c47-4aac-869b-15a01eea4604"). InnerVolumeSpecName "isvc-xgboost-graph-raw-73769-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:05:43.043790 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.043769 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe86fd31-2c47-4aac-869b-15a01eea4604-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fe86fd31-2c47-4aac-869b-15a01eea4604" (UID: "fe86fd31-2c47-4aac-869b-15a01eea4604"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:05:43.043942 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.043920 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe86fd31-2c47-4aac-869b-15a01eea4604-kube-api-access-4n2v5" (OuterVolumeSpecName: "kube-api-access-4n2v5") pod "fe86fd31-2c47-4aac-869b-15a01eea4604" (UID: "fe86fd31-2c47-4aac-869b-15a01eea4604"). InnerVolumeSpecName "kube-api-access-4n2v5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:05:43.142518 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.142477 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe86fd31-2c47-4aac-869b-15a01eea4604-kserve-provision-location\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:05:43.142518 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.142509 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4n2v5\" (UniqueName: \"kubernetes.io/projected/fe86fd31-2c47-4aac-869b-15a01eea4604-kube-api-access-4n2v5\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:05:43.142518 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.142520 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-73769-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fe86fd31-2c47-4aac-869b-15a01eea4604-isvc-xgboost-graph-raw-73769-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:05:43.142793 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.142532 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe86fd31-2c47-4aac-869b-15a01eea4604-proxy-tls\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:05:43.389189 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.389157 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" Apr 23 18:05:43.544774 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.544725 2576 generic.go:358] "Generic (PLEG): container finished" podID="38a8f284-3466-4635-8eed-34e77cafee06" containerID="871b3727dd105c41e141f7fdc20a9c1def58bd5791d881a2fbc7019bb341ae7f" exitCode=0 Apr 23 18:05:43.544962 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.544803 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" event={"ID":"38a8f284-3466-4635-8eed-34e77cafee06","Type":"ContainerDied","Data":"871b3727dd105c41e141f7fdc20a9c1def58bd5791d881a2fbc7019bb341ae7f"} Apr 23 18:05:43.545279 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.545252 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-73769-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-isvc-sklearn-graph-raw-73769-kube-rbac-proxy-sar-config\") pod \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\" (UID: \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\") " Apr 23 18:05:43.545414 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.545303 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6fvh\" (UniqueName: \"kubernetes.io/projected/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-kube-api-access-d6fvh\") pod \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\" (UID: \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\") " Apr 23 18:05:43.545414 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.545341 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-proxy-tls\") pod \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\" (UID: \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\") " Apr 23 18:05:43.545598 ip-10-0-132-102 
kubenswrapper[2576]: I0423 18:05:43.545414 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-kserve-provision-location\") pod \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\" (UID: \"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe\") " Apr 23 18:05:43.545731 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.545627 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-isvc-sklearn-graph-raw-73769-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-73769-kube-rbac-proxy-sar-config") pod "972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" (UID: "972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe"). InnerVolumeSpecName "isvc-sklearn-graph-raw-73769-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:05:43.545976 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.545894 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" (UID: "972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:05:43.546323 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.546298 2576 generic.go:358] "Generic (PLEG): container finished" podID="12caa729-dcf3-4776-aa94-964438c08d3e" containerID="7f29cad094197d4d7fdeb8e2e80bd9ad80dcc3de40bc4c96e9a27bab328b45b3" exitCode=0 Apr 23 18:05:43.546422 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.546340 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" event={"ID":"12caa729-dcf3-4776-aa94-964438c08d3e","Type":"ContainerDied","Data":"7f29cad094197d4d7fdeb8e2e80bd9ad80dcc3de40bc4c96e9a27bab328b45b3"} Apr 23 18:05:43.547926 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.547899 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-kube-api-access-d6fvh" (OuterVolumeSpecName: "kube-api-access-d6fvh") pod "972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" (UID: "972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe"). InnerVolumeSpecName "kube-api-access-d6fvh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:05:43.548057 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.547925 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" (UID: "972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:05:43.548576 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.548557 2576 generic.go:358] "Generic (PLEG): container finished" podID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerID="38cbf88c9c347b9eeb42da50bf383035fc4436c9a1442aad9be73b52e8031c03" exitCode=0 Apr 23 18:05:43.548671 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.548640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" event={"ID":"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe","Type":"ContainerDied","Data":"38cbf88c9c347b9eeb42da50bf383035fc4436c9a1442aad9be73b52e8031c03"} Apr 23 18:05:43.548671 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.548669 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" event={"ID":"972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe","Type":"ContainerDied","Data":"41752776ed12ba832b310eb025230404c52fcc68b8797f62e1e24d9b2274eec7"} Apr 23 18:05:43.548804 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.548645 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9" Apr 23 18:05:43.548804 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.548690 2576 scope.go:117] "RemoveContainer" containerID="1e724f2bf3b1bd98e1c175eda0dcec76cdc89c01f8f90f0f8852d53b32d9a9a6" Apr 23 18:05:43.550716 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.550690 2576 generic.go:358] "Generic (PLEG): container finished" podID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerID="998ecab27d60555639985c5c7f75c0a058947d04b1831ef4f8ef0b15d8fd947f" exitCode=0 Apr 23 18:05:43.550829 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.550775 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" Apr 23 18:05:43.550829 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.550786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" event={"ID":"fe86fd31-2c47-4aac-869b-15a01eea4604","Type":"ContainerDied","Data":"998ecab27d60555639985c5c7f75c0a058947d04b1831ef4f8ef0b15d8fd947f"} Apr 23 18:05:43.550829 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.550814 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf" event={"ID":"fe86fd31-2c47-4aac-869b-15a01eea4604","Type":"ContainerDied","Data":"c0f5ed25b41bb2116632542fa89bc99c5c653f97f0436da0289c24a97e0d05d0"} Apr 23 18:05:43.558094 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.558058 2576 scope.go:117] "RemoveContainer" containerID="38cbf88c9c347b9eeb42da50bf383035fc4436c9a1442aad9be73b52e8031c03" Apr 23 18:05:43.567587 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.567558 2576 scope.go:117] "RemoveContainer" containerID="cfefd387c3a20a9e9e24df5ef00b7ae1f32cf5138a49a1730fe62d9909908d21" Apr 23 18:05:43.574915 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.574895 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"] Apr 23 18:05:43.579421 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.578983 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-73769-predictor-57df48d995-j4hj9"] Apr 23 18:05:43.579421 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.579236 2576 scope.go:117] "RemoveContainer" containerID="1e724f2bf3b1bd98e1c175eda0dcec76cdc89c01f8f90f0f8852d53b32d9a9a6" Apr 23 18:05:43.579729 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:05:43.579703 2576 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"1e724f2bf3b1bd98e1c175eda0dcec76cdc89c01f8f90f0f8852d53b32d9a9a6\": container with ID starting with 1e724f2bf3b1bd98e1c175eda0dcec76cdc89c01f8f90f0f8852d53b32d9a9a6 not found: ID does not exist" containerID="1e724f2bf3b1bd98e1c175eda0dcec76cdc89c01f8f90f0f8852d53b32d9a9a6" Apr 23 18:05:43.579814 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.579760 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e724f2bf3b1bd98e1c175eda0dcec76cdc89c01f8f90f0f8852d53b32d9a9a6"} err="failed to get container status \"1e724f2bf3b1bd98e1c175eda0dcec76cdc89c01f8f90f0f8852d53b32d9a9a6\": rpc error: code = NotFound desc = could not find container \"1e724f2bf3b1bd98e1c175eda0dcec76cdc89c01f8f90f0f8852d53b32d9a9a6\": container with ID starting with 1e724f2bf3b1bd98e1c175eda0dcec76cdc89c01f8f90f0f8852d53b32d9a9a6 not found: ID does not exist" Apr 23 18:05:43.579814 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.579786 2576 scope.go:117] "RemoveContainer" containerID="38cbf88c9c347b9eeb42da50bf383035fc4436c9a1442aad9be73b52e8031c03" Apr 23 18:05:43.580103 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:05:43.580074 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38cbf88c9c347b9eeb42da50bf383035fc4436c9a1442aad9be73b52e8031c03\": container with ID starting with 38cbf88c9c347b9eeb42da50bf383035fc4436c9a1442aad9be73b52e8031c03 not found: ID does not exist" containerID="38cbf88c9c347b9eeb42da50bf383035fc4436c9a1442aad9be73b52e8031c03" Apr 23 18:05:43.580194 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.580112 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38cbf88c9c347b9eeb42da50bf383035fc4436c9a1442aad9be73b52e8031c03"} err="failed to get container status 
\"38cbf88c9c347b9eeb42da50bf383035fc4436c9a1442aad9be73b52e8031c03\": rpc error: code = NotFound desc = could not find container \"38cbf88c9c347b9eeb42da50bf383035fc4436c9a1442aad9be73b52e8031c03\": container with ID starting with 38cbf88c9c347b9eeb42da50bf383035fc4436c9a1442aad9be73b52e8031c03 not found: ID does not exist" Apr 23 18:05:43.580194 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.580135 2576 scope.go:117] "RemoveContainer" containerID="cfefd387c3a20a9e9e24df5ef00b7ae1f32cf5138a49a1730fe62d9909908d21" Apr 23 18:05:43.580453 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:05:43.580427 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfefd387c3a20a9e9e24df5ef00b7ae1f32cf5138a49a1730fe62d9909908d21\": container with ID starting with cfefd387c3a20a9e9e24df5ef00b7ae1f32cf5138a49a1730fe62d9909908d21 not found: ID does not exist" containerID="cfefd387c3a20a9e9e24df5ef00b7ae1f32cf5138a49a1730fe62d9909908d21" Apr 23 18:05:43.580502 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.580465 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfefd387c3a20a9e9e24df5ef00b7ae1f32cf5138a49a1730fe62d9909908d21"} err="failed to get container status \"cfefd387c3a20a9e9e24df5ef00b7ae1f32cf5138a49a1730fe62d9909908d21\": rpc error: code = NotFound desc = could not find container \"cfefd387c3a20a9e9e24df5ef00b7ae1f32cf5138a49a1730fe62d9909908d21\": container with ID starting with cfefd387c3a20a9e9e24df5ef00b7ae1f32cf5138a49a1730fe62d9909908d21 not found: ID does not exist" Apr 23 18:05:43.580502 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.580488 2576 scope.go:117] "RemoveContainer" containerID="562f1d0c95ba1c5756e935602c3122c2854d63f30536ce3944aa72e36504194b" Apr 23 18:05:43.590220 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.590200 2576 scope.go:117] "RemoveContainer" 
containerID="998ecab27d60555639985c5c7f75c0a058947d04b1831ef4f8ef0b15d8fd947f" Apr 23 18:05:43.598923 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.598896 2576 scope.go:117] "RemoveContainer" containerID="bdf40818b4c5e670ce79a2f09bb7a49a6c582c6e8d2f94b2fd1e3e2a95061123" Apr 23 18:05:43.606392 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.606365 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"] Apr 23 18:05:43.607466 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.607448 2576 scope.go:117] "RemoveContainer" containerID="562f1d0c95ba1c5756e935602c3122c2854d63f30536ce3944aa72e36504194b" Apr 23 18:05:43.607778 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:05:43.607754 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"562f1d0c95ba1c5756e935602c3122c2854d63f30536ce3944aa72e36504194b\": container with ID starting with 562f1d0c95ba1c5756e935602c3122c2854d63f30536ce3944aa72e36504194b not found: ID does not exist" containerID="562f1d0c95ba1c5756e935602c3122c2854d63f30536ce3944aa72e36504194b" Apr 23 18:05:43.607869 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.607785 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"562f1d0c95ba1c5756e935602c3122c2854d63f30536ce3944aa72e36504194b"} err="failed to get container status \"562f1d0c95ba1c5756e935602c3122c2854d63f30536ce3944aa72e36504194b\": rpc error: code = NotFound desc = could not find container \"562f1d0c95ba1c5756e935602c3122c2854d63f30536ce3944aa72e36504194b\": container with ID starting with 562f1d0c95ba1c5756e935602c3122c2854d63f30536ce3944aa72e36504194b not found: ID does not exist" Apr 23 18:05:43.607869 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.607806 2576 scope.go:117] "RemoveContainer" containerID="998ecab27d60555639985c5c7f75c0a058947d04b1831ef4f8ef0b15d8fd947f" Apr 23 
18:05:43.608074 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:05:43.608055 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"998ecab27d60555639985c5c7f75c0a058947d04b1831ef4f8ef0b15d8fd947f\": container with ID starting with 998ecab27d60555639985c5c7f75c0a058947d04b1831ef4f8ef0b15d8fd947f not found: ID does not exist" containerID="998ecab27d60555639985c5c7f75c0a058947d04b1831ef4f8ef0b15d8fd947f" Apr 23 18:05:43.608151 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.608083 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"998ecab27d60555639985c5c7f75c0a058947d04b1831ef4f8ef0b15d8fd947f"} err="failed to get container status \"998ecab27d60555639985c5c7f75c0a058947d04b1831ef4f8ef0b15d8fd947f\": rpc error: code = NotFound desc = could not find container \"998ecab27d60555639985c5c7f75c0a058947d04b1831ef4f8ef0b15d8fd947f\": container with ID starting with 998ecab27d60555639985c5c7f75c0a058947d04b1831ef4f8ef0b15d8fd947f not found: ID does not exist" Apr 23 18:05:43.608151 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.608106 2576 scope.go:117] "RemoveContainer" containerID="bdf40818b4c5e670ce79a2f09bb7a49a6c582c6e8d2f94b2fd1e3e2a95061123" Apr 23 18:05:43.608475 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:05:43.608448 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdf40818b4c5e670ce79a2f09bb7a49a6c582c6e8d2f94b2fd1e3e2a95061123\": container with ID starting with bdf40818b4c5e670ce79a2f09bb7a49a6c582c6e8d2f94b2fd1e3e2a95061123 not found: ID does not exist" containerID="bdf40818b4c5e670ce79a2f09bb7a49a6c582c6e8d2f94b2fd1e3e2a95061123" Apr 23 18:05:43.608563 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.608481 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bdf40818b4c5e670ce79a2f09bb7a49a6c582c6e8d2f94b2fd1e3e2a95061123"} err="failed to get container status \"bdf40818b4c5e670ce79a2f09bb7a49a6c582c6e8d2f94b2fd1e3e2a95061123\": rpc error: code = NotFound desc = could not find container \"bdf40818b4c5e670ce79a2f09bb7a49a6c582c6e8d2f94b2fd1e3e2a95061123\": container with ID starting with bdf40818b4c5e670ce79a2f09bb7a49a6c582c6e8d2f94b2fd1e3e2a95061123 not found: ID does not exist" Apr 23 18:05:43.613566 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.613535 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-73769-predictor-7ff67dc9b9-rnljf"] Apr 23 18:05:43.646983 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.646895 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-73769-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-isvc-sklearn-graph-raw-73769-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:05:43.646983 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.646929 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d6fvh\" (UniqueName: \"kubernetes.io/projected/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-kube-api-access-d6fvh\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:05:43.646983 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.646944 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-proxy-tls\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:05:43.646983 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:43.646959 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe-kserve-provision-location\") on node 
\"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:05:44.394561 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:44.394524 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" path="/var/lib/kubelet/pods/972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe/volumes" Apr 23 18:05:44.395081 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:44.395066 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" path="/var/lib/kubelet/pods/fe86fd31-2c47-4aac-869b-15a01eea4604/volumes" Apr 23 18:05:44.555435 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:44.555400 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" event={"ID":"38a8f284-3466-4635-8eed-34e77cafee06","Type":"ContainerStarted","Data":"e06dbb7c37dd74b6775c188cc62656a45975fbfe86b43638de9cb441ac85953f"} Apr 23 18:05:44.555435 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:44.555432 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" event={"ID":"38a8f284-3466-4635-8eed-34e77cafee06","Type":"ContainerStarted","Data":"185060cb79df7bef85c6b936616ae02b1b9b1b6efe872b3f2ba7da621a20c625"} Apr 23 18:05:44.555680 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:44.555646 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:44.557002 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:44.556983 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" event={"ID":"12caa729-dcf3-4776-aa94-964438c08d3e","Type":"ContainerStarted","Data":"1606f8eef76b99b61e30da163264a78e95f5517574979d5991358a0839198b80"} Apr 23 18:05:44.557093 ip-10-0-132-102 
kubenswrapper[2576]: I0423 18:05:44.557006 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" event={"ID":"12caa729-dcf3-4776-aa94-964438c08d3e","Type":"ContainerStarted","Data":"301d884e7214d4ac579a735fb0e8869270a2549260a81b7f4e63388e0199ef00"} Apr 23 18:05:44.557238 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:44.557202 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:05:44.557238 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:44.557236 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:05:44.558606 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:44.558575 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 18:05:44.573415 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:44.573360 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" podStartSLOduration=6.57334295 podStartE2EDuration="6.57334295s" podCreationTimestamp="2026-04-23 18:05:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:05:44.572591098 +0000 UTC m=+814.808037811" watchObservedRunningTime="2026-04-23 18:05:44.57334295 +0000 UTC m=+814.808789656" Apr 23 18:05:44.591524 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:44.591471 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" podStartSLOduration=5.591455542 podStartE2EDuration="5.591455542s" podCreationTimestamp="2026-04-23 18:05:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:05:44.590868201 +0000 UTC m=+814.826314903" watchObservedRunningTime="2026-04-23 18:05:44.591455542 +0000 UTC m=+814.826902284" Apr 23 18:05:45.563050 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:45.562778 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 18:05:45.563050 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:45.562978 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:45.564030 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:45.563997 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 18:05:46.564809 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:46.564765 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 18:05:50.565732 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:50.565703 2576 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:05:50.566292 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:50.566267 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 18:05:51.569458 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:51.569430 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:05:51.569927 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:05:51.569895 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 18:06:00.566222 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:06:00.566181 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 18:06:01.570008 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:06:01.569973 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 18:06:10.566547 ip-10-0-132-102 kubenswrapper[2576]: I0423 
18:06:10.566503 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 18:06:11.570175 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:06:11.570137 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 18:06:20.566958 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:06:20.566914 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 18:06:21.570004 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:06:21.569965 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 18:06:30.567123 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:06:30.567081 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 18:06:31.569973 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:06:31.569932 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 18:06:40.566903 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:06:40.566862 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 18:06:41.570244 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:06:41.570200 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 18:06:50.566941 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:06:50.566905 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:06:51.570948 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:06:51.570919 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:07:10.323791 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:10.323760 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6g56n_ae56a92f-dfae-4763-b849-dca72bc2cf3d/kube-multus/0.log" Apr 23 18:07:10.325445 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:10.325423 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6g56n_ae56a92f-dfae-4763-b849-dca72bc2cf3d/kube-multus/0.log" Apr 
23 18:07:19.163173 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.163135 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt"] Apr 23 18:07:19.163655 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.163494 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kserve-container" containerID="cri-o://185060cb79df7bef85c6b936616ae02b1b9b1b6efe872b3f2ba7da621a20c625" gracePeriod=30 Apr 23 18:07:19.163655 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.163516 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kube-rbac-proxy" containerID="cri-o://e06dbb7c37dd74b6775c188cc62656a45975fbfe86b43638de9cb441ac85953f" gracePeriod=30 Apr 23 18:07:19.204062 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204033 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2"] Apr 23 18:07:19.204338 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204326 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kube-rbac-proxy" Apr 23 18:07:19.204383 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204339 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kube-rbac-proxy" Apr 23 18:07:19.204383 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204357 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="storage-initializer" Apr 23 18:07:19.204383 
ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204362 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="storage-initializer" Apr 23 18:07:19.204383 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204371 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="storage-initializer" Apr 23 18:07:19.204383 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204376 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="storage-initializer" Apr 23 18:07:19.204383 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204383 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kserve-container" Apr 23 18:07:19.204559 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204388 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kserve-container" Apr 23 18:07:19.204559 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204394 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="kserve-container" Apr 23 18:07:19.204559 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204399 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="kserve-container" Apr 23 18:07:19.204559 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204412 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="kube-rbac-proxy" Apr 23 18:07:19.204559 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204417 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="kube-rbac-proxy" 
Apr 23 18:07:19.204559 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204459 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="kserve-container" Apr 23 18:07:19.204559 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204468 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe86fd31-2c47-4aac-869b-15a01eea4604" containerName="kube-rbac-proxy" Apr 23 18:07:19.204559 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204475 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kube-rbac-proxy" Apr 23 18:07:19.204559 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.204483 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="972b4d2b-3e63-4c82-b97c-cd3ac27b7bbe" containerName="kserve-container" Apr 23 18:07:19.207320 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.207304 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:07:19.209144 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.209116 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-f1c6b-predictor-serving-cert\"" Apr 23 18:07:19.209347 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.209332 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-f1c6b-kube-rbac-proxy-sar-config\"" Apr 23 18:07:19.214859 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.214838 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2"] Apr 23 18:07:19.251198 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.251169 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x"] Apr 23 18:07:19.251518 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.251489 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kserve-container" containerID="cri-o://301d884e7214d4ac579a735fb0e8869270a2549260a81b7f4e63388e0199ef00" gracePeriod=30 Apr 23 18:07:19.251657 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.251534 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kube-rbac-proxy" containerID="cri-o://1606f8eef76b99b61e30da163264a78e95f5517574979d5991358a0839198b80" gracePeriod=30 Apr 23 18:07:19.292204 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.292165 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-proxy-tls\") pod \"message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2\" (UID: \"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a\") " pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:07:19.292335 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.292261 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-raw-f1c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-message-dumper-raw-f1c6b-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2\" (UID: \"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a\") " pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:07:19.292335 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.292298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vcxt\" (UniqueName: \"kubernetes.io/projected/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-kube-api-access-6vcxt\") pod \"message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2\" (UID: \"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a\") " pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:07:19.393588 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.393556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-raw-f1c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-message-dumper-raw-f1c6b-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2\" (UID: \"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a\") " pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:07:19.393719 ip-10-0-132-102 
kubenswrapper[2576]: I0423 18:07:19.393604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vcxt\" (UniqueName: \"kubernetes.io/projected/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-kube-api-access-6vcxt\") pod \"message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2\" (UID: \"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a\") " pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:07:19.393719 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.393639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-proxy-tls\") pod \"message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2\" (UID: \"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a\") " pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:07:19.394182 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.394159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-raw-f1c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-message-dumper-raw-f1c6b-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2\" (UID: \"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a\") " pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:07:19.395849 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.395829 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-proxy-tls\") pod \"message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2\" (UID: \"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a\") " pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:07:19.401229 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.401207 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6vcxt\" (UniqueName: \"kubernetes.io/projected/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-kube-api-access-6vcxt\") pod \"message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2\" (UID: \"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a\") " pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:07:19.518272 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.518182 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:07:19.632753 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.632595 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2"] Apr 23 18:07:19.635231 ip-10-0-132-102 kubenswrapper[2576]: W0423 18:07:19.635202 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6861b4e_d7a6_4e70_9d96_fd8771c8c90a.slice/crio-652e0053bb45191c031001dce3c8fa4e8aecfc8c5a172d6ac99bf08f76b08478 WatchSource:0}: Error finding container 652e0053bb45191c031001dce3c8fa4e8aecfc8c5a172d6ac99bf08f76b08478: Status 404 returned error can't find the container with id 652e0053bb45191c031001dce3c8fa4e8aecfc8c5a172d6ac99bf08f76b08478 Apr 23 18:07:19.827352 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.827268 2576 generic.go:358] "Generic (PLEG): container finished" podID="38a8f284-3466-4635-8eed-34e77cafee06" containerID="e06dbb7c37dd74b6775c188cc62656a45975fbfe86b43638de9cb441ac85953f" exitCode=2 Apr 23 18:07:19.827352 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.827338 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" event={"ID":"38a8f284-3466-4635-8eed-34e77cafee06","Type":"ContainerDied","Data":"e06dbb7c37dd74b6775c188cc62656a45975fbfe86b43638de9cb441ac85953f"} Apr 23 
18:07:19.828980 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.828958 2576 generic.go:358] "Generic (PLEG): container finished" podID="12caa729-dcf3-4776-aa94-964438c08d3e" containerID="1606f8eef76b99b61e30da163264a78e95f5517574979d5991358a0839198b80" exitCode=2 Apr 23 18:07:19.829081 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.829029 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" event={"ID":"12caa729-dcf3-4776-aa94-964438c08d3e","Type":"ContainerDied","Data":"1606f8eef76b99b61e30da163264a78e95f5517574979d5991358a0839198b80"} Apr 23 18:07:19.829874 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:19.829857 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" event={"ID":"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a","Type":"ContainerStarted","Data":"652e0053bb45191c031001dce3c8fa4e8aecfc8c5a172d6ac99bf08f76b08478"} Apr 23 18:07:20.562354 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:20.562315 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.24:8643/healthz\": dial tcp 10.133.0.24:8643: connect: connection refused" Apr 23 18:07:20.566645 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:20.566618 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 18:07:20.834929 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:20.834842 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" event={"ID":"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a","Type":"ContainerStarted","Data":"560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c"} Apr 23 18:07:20.834929 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:20.834884 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" event={"ID":"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a","Type":"ContainerStarted","Data":"2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a"} Apr 23 18:07:20.835098 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:20.835010 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:07:20.852495 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:20.852446 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" podStartSLOduration=0.914977543 podStartE2EDuration="1.852433449s" podCreationTimestamp="2026-04-23 18:07:19 +0000 UTC" firstStartedPulling="2026-04-23 18:07:19.637033011 +0000 UTC m=+909.872479692" lastFinishedPulling="2026-04-23 18:07:20.574488918 +0000 UTC m=+910.809935598" observedRunningTime="2026-04-23 18:07:20.850702019 +0000 UTC m=+911.086148722" watchObservedRunningTime="2026-04-23 18:07:20.852433449 +0000 UTC m=+911.087880151" Apr 23 18:07:21.565051 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:21.565012 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.23:8643/healthz\": dial tcp 10.133.0.23:8643: connect: connection refused" Apr 23 18:07:21.569957 ip-10-0-132-102 kubenswrapper[2576]: 
I0423 18:07:21.569928 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 18:07:21.838571 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:21.838486 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:07:21.840249 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:21.840225 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:07:22.786180 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.786160 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:07:22.842730 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.842697 2576 generic.go:358] "Generic (PLEG): container finished" podID="12caa729-dcf3-4776-aa94-964438c08d3e" containerID="301d884e7214d4ac579a735fb0e8869270a2549260a81b7f4e63388e0199ef00" exitCode=0 Apr 23 18:07:22.842902 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.842763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" event={"ID":"12caa729-dcf3-4776-aa94-964438c08d3e","Type":"ContainerDied","Data":"301d884e7214d4ac579a735fb0e8869270a2549260a81b7f4e63388e0199ef00"} Apr 23 18:07:22.842902 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.842803 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" 
event={"ID":"12caa729-dcf3-4776-aa94-964438c08d3e","Type":"ContainerDied","Data":"b7d28ef1d3ab92c66f8f8892f2675390410620341e09e593b3bc9f67888855b9"} Apr 23 18:07:22.842902 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.842804 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x" Apr 23 18:07:22.842902 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.842877 2576 scope.go:117] "RemoveContainer" containerID="1606f8eef76b99b61e30da163264a78e95f5517574979d5991358a0839198b80" Apr 23 18:07:22.850246 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.850228 2576 scope.go:117] "RemoveContainer" containerID="301d884e7214d4ac579a735fb0e8869270a2549260a81b7f4e63388e0199ef00" Apr 23 18:07:22.856805 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.856788 2576 scope.go:117] "RemoveContainer" containerID="7f29cad094197d4d7fdeb8e2e80bd9ad80dcc3de40bc4c96e9a27bab328b45b3" Apr 23 18:07:22.863392 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.863370 2576 scope.go:117] "RemoveContainer" containerID="1606f8eef76b99b61e30da163264a78e95f5517574979d5991358a0839198b80" Apr 23 18:07:22.863636 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:07:22.863620 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1606f8eef76b99b61e30da163264a78e95f5517574979d5991358a0839198b80\": container with ID starting with 1606f8eef76b99b61e30da163264a78e95f5517574979d5991358a0839198b80 not found: ID does not exist" containerID="1606f8eef76b99b61e30da163264a78e95f5517574979d5991358a0839198b80" Apr 23 18:07:22.863685 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.863642 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1606f8eef76b99b61e30da163264a78e95f5517574979d5991358a0839198b80"} err="failed to get container status 
\"1606f8eef76b99b61e30da163264a78e95f5517574979d5991358a0839198b80\": rpc error: code = NotFound desc = could not find container \"1606f8eef76b99b61e30da163264a78e95f5517574979d5991358a0839198b80\": container with ID starting with 1606f8eef76b99b61e30da163264a78e95f5517574979d5991358a0839198b80 not found: ID does not exist" Apr 23 18:07:22.863685 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.863659 2576 scope.go:117] "RemoveContainer" containerID="301d884e7214d4ac579a735fb0e8869270a2549260a81b7f4e63388e0199ef00" Apr 23 18:07:22.863884 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:07:22.863868 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"301d884e7214d4ac579a735fb0e8869270a2549260a81b7f4e63388e0199ef00\": container with ID starting with 301d884e7214d4ac579a735fb0e8869270a2549260a81b7f4e63388e0199ef00 not found: ID does not exist" containerID="301d884e7214d4ac579a735fb0e8869270a2549260a81b7f4e63388e0199ef00" Apr 23 18:07:22.863927 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.863888 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"301d884e7214d4ac579a735fb0e8869270a2549260a81b7f4e63388e0199ef00"} err="failed to get container status \"301d884e7214d4ac579a735fb0e8869270a2549260a81b7f4e63388e0199ef00\": rpc error: code = NotFound desc = could not find container \"301d884e7214d4ac579a735fb0e8869270a2549260a81b7f4e63388e0199ef00\": container with ID starting with 301d884e7214d4ac579a735fb0e8869270a2549260a81b7f4e63388e0199ef00 not found: ID does not exist" Apr 23 18:07:22.863927 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.863901 2576 scope.go:117] "RemoveContainer" containerID="7f29cad094197d4d7fdeb8e2e80bd9ad80dcc3de40bc4c96e9a27bab328b45b3" Apr 23 18:07:22.864099 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:07:22.864079 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"7f29cad094197d4d7fdeb8e2e80bd9ad80dcc3de40bc4c96e9a27bab328b45b3\": container with ID starting with 7f29cad094197d4d7fdeb8e2e80bd9ad80dcc3de40bc4c96e9a27bab328b45b3 not found: ID does not exist" containerID="7f29cad094197d4d7fdeb8e2e80bd9ad80dcc3de40bc4c96e9a27bab328b45b3" Apr 23 18:07:22.864137 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.864105 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f29cad094197d4d7fdeb8e2e80bd9ad80dcc3de40bc4c96e9a27bab328b45b3"} err="failed to get container status \"7f29cad094197d4d7fdeb8e2e80bd9ad80dcc3de40bc4c96e9a27bab328b45b3\": rpc error: code = NotFound desc = could not find container \"7f29cad094197d4d7fdeb8e2e80bd9ad80dcc3de40bc4c96e9a27bab328b45b3\": container with ID starting with 7f29cad094197d4d7fdeb8e2e80bd9ad80dcc3de40bc4c96e9a27bab328b45b3 not found: ID does not exist" Apr 23 18:07:22.924693 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.924618 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12caa729-dcf3-4776-aa94-964438c08d3e-kserve-provision-location\") pod \"12caa729-dcf3-4776-aa94-964438c08d3e\" (UID: \"12caa729-dcf3-4776-aa94-964438c08d3e\") " Apr 23 18:07:22.924693 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.924680 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/12caa729-dcf3-4776-aa94-964438c08d3e-isvc-xgboost-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\") pod \"12caa729-dcf3-4776-aa94-964438c08d3e\" (UID: \"12caa729-dcf3-4776-aa94-964438c08d3e\") " Apr 23 18:07:22.924928 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.924701 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7x77\" (UniqueName: 
\"kubernetes.io/projected/12caa729-dcf3-4776-aa94-964438c08d3e-kube-api-access-c7x77\") pod \"12caa729-dcf3-4776-aa94-964438c08d3e\" (UID: \"12caa729-dcf3-4776-aa94-964438c08d3e\") " Apr 23 18:07:22.924928 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.924762 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12caa729-dcf3-4776-aa94-964438c08d3e-proxy-tls\") pod \"12caa729-dcf3-4776-aa94-964438c08d3e\" (UID: \"12caa729-dcf3-4776-aa94-964438c08d3e\") " Apr 23 18:07:22.925051 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.925020 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12caa729-dcf3-4776-aa94-964438c08d3e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "12caa729-dcf3-4776-aa94-964438c08d3e" (UID: "12caa729-dcf3-4776-aa94-964438c08d3e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:07:22.925112 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.925053 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12caa729-dcf3-4776-aa94-964438c08d3e-isvc-xgboost-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config") pod "12caa729-dcf3-4776-aa94-964438c08d3e" (UID: "12caa729-dcf3-4776-aa94-964438c08d3e"). InnerVolumeSpecName "isvc-xgboost-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:07:22.926874 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.926848 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12caa729-dcf3-4776-aa94-964438c08d3e-kube-api-access-c7x77" (OuterVolumeSpecName: "kube-api-access-c7x77") pod "12caa729-dcf3-4776-aa94-964438c08d3e" (UID: "12caa729-dcf3-4776-aa94-964438c08d3e"). InnerVolumeSpecName "kube-api-access-c7x77". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:07:22.926874 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:22.926852 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12caa729-dcf3-4776-aa94-964438c08d3e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "12caa729-dcf3-4776-aa94-964438c08d3e" (UID: "12caa729-dcf3-4776-aa94-964438c08d3e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:07:23.025265 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.025239 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12caa729-dcf3-4776-aa94-964438c08d3e-proxy-tls\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:07:23.025265 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.025263 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12caa729-dcf3-4776-aa94-964438c08d3e-kserve-provision-location\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:07:23.025448 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.025273 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/12caa729-dcf3-4776-aa94-964438c08d3e-isvc-xgboost-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-102.ec2.internal\" 
DevicePath \"\"" Apr 23 18:07:23.025448 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.025287 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c7x77\" (UniqueName: \"kubernetes.io/projected/12caa729-dcf3-4776-aa94-964438c08d3e-kube-api-access-c7x77\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:07:23.164351 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.164320 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x"] Apr 23 18:07:23.168537 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.168513 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-050b4-predictor-5d6f74677-g4k5x"] Apr 23 18:07:23.395535 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.395514 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:07:23.530592 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.530424 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2scfw\" (UniqueName: \"kubernetes.io/projected/38a8f284-3466-4635-8eed-34e77cafee06-kube-api-access-2scfw\") pod \"38a8f284-3466-4635-8eed-34e77cafee06\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " Apr 23 18:07:23.530592 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.530466 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38a8f284-3466-4635-8eed-34e77cafee06-kserve-provision-location\") pod \"38a8f284-3466-4635-8eed-34e77cafee06\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " Apr 23 18:07:23.530592 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.530498 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/38a8f284-3466-4635-8eed-34e77cafee06-proxy-tls\") pod \"38a8f284-3466-4635-8eed-34e77cafee06\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " Apr 23 18:07:23.530592 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.530551 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38a8f284-3466-4635-8eed-34e77cafee06-isvc-sklearn-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\") pod \"38a8f284-3466-4635-8eed-34e77cafee06\" (UID: \"38a8f284-3466-4635-8eed-34e77cafee06\") " Apr 23 18:07:23.549515 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.530808 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38a8f284-3466-4635-8eed-34e77cafee06-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "38a8f284-3466-4635-8eed-34e77cafee06" (UID: "38a8f284-3466-4635-8eed-34e77cafee06"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:07:23.549515 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.530948 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a8f284-3466-4635-8eed-34e77cafee06-isvc-sklearn-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config") pod "38a8f284-3466-4635-8eed-34e77cafee06" (UID: "38a8f284-3466-4635-8eed-34e77cafee06"). InnerVolumeSpecName "isvc-sklearn-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:07:23.549515 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.532536 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a8f284-3466-4635-8eed-34e77cafee06-kube-api-access-2scfw" (OuterVolumeSpecName: "kube-api-access-2scfw") pod "38a8f284-3466-4635-8eed-34e77cafee06" (UID: "38a8f284-3466-4635-8eed-34e77cafee06"). InnerVolumeSpecName "kube-api-access-2scfw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:07:23.549515 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.532551 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a8f284-3466-4635-8eed-34e77cafee06-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "38a8f284-3466-4635-8eed-34e77cafee06" (UID: "38a8f284-3466-4635-8eed-34e77cafee06"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:07:23.631977 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.631951 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38a8f284-3466-4635-8eed-34e77cafee06-isvc-sklearn-graph-raw-hpa-050b4-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:07:23.632076 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.631980 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2scfw\" (UniqueName: \"kubernetes.io/projected/38a8f284-3466-4635-8eed-34e77cafee06-kube-api-access-2scfw\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:07:23.632076 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.631991 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38a8f284-3466-4635-8eed-34e77cafee06-kserve-provision-location\") on node 
\"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:07:23.632076 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.632002 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38a8f284-3466-4635-8eed-34e77cafee06-proxy-tls\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:07:23.847823 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.847717 2576 generic.go:358] "Generic (PLEG): container finished" podID="38a8f284-3466-4635-8eed-34e77cafee06" containerID="185060cb79df7bef85c6b936616ae02b1b9b1b6efe872b3f2ba7da621a20c625" exitCode=0 Apr 23 18:07:23.847823 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.847778 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" event={"ID":"38a8f284-3466-4635-8eed-34e77cafee06","Type":"ContainerDied","Data":"185060cb79df7bef85c6b936616ae02b1b9b1b6efe872b3f2ba7da621a20c625"} Apr 23 18:07:23.847823 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.847812 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" event={"ID":"38a8f284-3466-4635-8eed-34e77cafee06","Type":"ContainerDied","Data":"02e7f81d06992d7fe5df60b777d3c5b98e482633dbf7b9a5ae867f2dac6e41aa"} Apr 23 18:07:23.848370 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.847828 2576 scope.go:117] "RemoveContainer" containerID="e06dbb7c37dd74b6775c188cc62656a45975fbfe86b43638de9cb441ac85953f" Apr 23 18:07:23.848370 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.847837 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt" Apr 23 18:07:23.856494 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.856475 2576 scope.go:117] "RemoveContainer" containerID="185060cb79df7bef85c6b936616ae02b1b9b1b6efe872b3f2ba7da621a20c625" Apr 23 18:07:23.863182 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.863166 2576 scope.go:117] "RemoveContainer" containerID="871b3727dd105c41e141f7fdc20a9c1def58bd5791d881a2fbc7019bb341ae7f" Apr 23 18:07:23.869546 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.869458 2576 scope.go:117] "RemoveContainer" containerID="e06dbb7c37dd74b6775c188cc62656a45975fbfe86b43638de9cb441ac85953f" Apr 23 18:07:23.872540 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:07:23.870463 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e06dbb7c37dd74b6775c188cc62656a45975fbfe86b43638de9cb441ac85953f\": container with ID starting with e06dbb7c37dd74b6775c188cc62656a45975fbfe86b43638de9cb441ac85953f not found: ID does not exist" containerID="e06dbb7c37dd74b6775c188cc62656a45975fbfe86b43638de9cb441ac85953f" Apr 23 18:07:23.872540 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.870510 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e06dbb7c37dd74b6775c188cc62656a45975fbfe86b43638de9cb441ac85953f"} err="failed to get container status \"e06dbb7c37dd74b6775c188cc62656a45975fbfe86b43638de9cb441ac85953f\": rpc error: code = NotFound desc = could not find container \"e06dbb7c37dd74b6775c188cc62656a45975fbfe86b43638de9cb441ac85953f\": container with ID starting with e06dbb7c37dd74b6775c188cc62656a45975fbfe86b43638de9cb441ac85953f not found: ID does not exist" Apr 23 18:07:23.872540 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.870534 2576 scope.go:117] "RemoveContainer" containerID="185060cb79df7bef85c6b936616ae02b1b9b1b6efe872b3f2ba7da621a20c625" 
Apr 23 18:07:23.872540 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:07:23.870939 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"185060cb79df7bef85c6b936616ae02b1b9b1b6efe872b3f2ba7da621a20c625\": container with ID starting with 185060cb79df7bef85c6b936616ae02b1b9b1b6efe872b3f2ba7da621a20c625 not found: ID does not exist" containerID="185060cb79df7bef85c6b936616ae02b1b9b1b6efe872b3f2ba7da621a20c625" Apr 23 18:07:23.872540 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.870993 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185060cb79df7bef85c6b936616ae02b1b9b1b6efe872b3f2ba7da621a20c625"} err="failed to get container status \"185060cb79df7bef85c6b936616ae02b1b9b1b6efe872b3f2ba7da621a20c625\": rpc error: code = NotFound desc = could not find container \"185060cb79df7bef85c6b936616ae02b1b9b1b6efe872b3f2ba7da621a20c625\": container with ID starting with 185060cb79df7bef85c6b936616ae02b1b9b1b6efe872b3f2ba7da621a20c625 not found: ID does not exist" Apr 23 18:07:23.872540 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.871013 2576 scope.go:117] "RemoveContainer" containerID="871b3727dd105c41e141f7fdc20a9c1def58bd5791d881a2fbc7019bb341ae7f" Apr 23 18:07:23.872540 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.871863 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt"] Apr 23 18:07:23.872939 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:07:23.872588 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"871b3727dd105c41e141f7fdc20a9c1def58bd5791d881a2fbc7019bb341ae7f\": container with ID starting with 871b3727dd105c41e141f7fdc20a9c1def58bd5791d881a2fbc7019bb341ae7f not found: ID does not exist" 
containerID="871b3727dd105c41e141f7fdc20a9c1def58bd5791d881a2fbc7019bb341ae7f" Apr 23 18:07:23.872939 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.872618 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"871b3727dd105c41e141f7fdc20a9c1def58bd5791d881a2fbc7019bb341ae7f"} err="failed to get container status \"871b3727dd105c41e141f7fdc20a9c1def58bd5791d881a2fbc7019bb341ae7f\": rpc error: code = NotFound desc = could not find container \"871b3727dd105c41e141f7fdc20a9c1def58bd5791d881a2fbc7019bb341ae7f\": container with ID starting with 871b3727dd105c41e141f7fdc20a9c1def58bd5791d881a2fbc7019bb341ae7f not found: ID does not exist" Apr 23 18:07:23.876009 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:23.875985 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-050b4-predictor-55b9478559-2rnkt"] Apr 23 18:07:24.398684 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:24.394722 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" path="/var/lib/kubelet/pods/12caa729-dcf3-4776-aa94-964438c08d3e/volumes" Apr 23 18:07:24.398684 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:24.395480 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a8f284-3466-4635-8eed-34e77cafee06" path="/var/lib/kubelet/pods/38a8f284-3466-4635-8eed-34e77cafee06/volumes" Apr 23 18:07:28.854072 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:28.854045 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:07:29.270260 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270224 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p"] Apr 23 18:07:29.270492 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270480 2576 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kserve-container" Apr 23 18:07:29.270607 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270494 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kserve-container" Apr 23 18:07:29.270607 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270503 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kube-rbac-proxy" Apr 23 18:07:29.270607 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270509 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kube-rbac-proxy" Apr 23 18:07:29.270607 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270518 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="storage-initializer" Apr 23 18:07:29.270607 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270524 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="storage-initializer" Apr 23 18:07:29.270607 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270537 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="storage-initializer" Apr 23 18:07:29.270607 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270544 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="storage-initializer" Apr 23 18:07:29.270607 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270549 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kserve-container" Apr 23 18:07:29.270607 
ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270554 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kserve-container" Apr 23 18:07:29.270607 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270561 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kube-rbac-proxy" Apr 23 18:07:29.270607 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270566 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kube-rbac-proxy" Apr 23 18:07:29.271143 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270616 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kserve-container" Apr 23 18:07:29.271143 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270624 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kube-rbac-proxy" Apr 23 18:07:29.271143 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270630 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="12caa729-dcf3-4776-aa94-964438c08d3e" containerName="kserve-container" Apr 23 18:07:29.271143 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.270637 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="38a8f284-3466-4635-8eed-34e77cafee06" containerName="kube-rbac-proxy" Apr 23 18:07:29.273657 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.273636 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:29.275669 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.275650 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-f1c6b-kube-rbac-proxy-sar-config\"" Apr 23 18:07:29.275796 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.275712 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-f1c6b-predictor-serving-cert\"" Apr 23 18:07:29.282922 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.282897 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p"] Apr 23 18:07:29.378832 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.378795 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-raw-f1c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/72df63a8-0670-4a87-b85a-926ad14f6594-isvc-logger-raw-f1c6b-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p\" (UID: \"72df63a8-0670-4a87-b85a-926ad14f6594\") " pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:29.378832 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.378837 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwq8v\" (UniqueName: \"kubernetes.io/projected/72df63a8-0670-4a87-b85a-926ad14f6594-kube-api-access-lwq8v\") pod \"isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p\" (UID: \"72df63a8-0670-4a87-b85a-926ad14f6594\") " pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:29.379017 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.378960 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72df63a8-0670-4a87-b85a-926ad14f6594-proxy-tls\") pod \"isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p\" (UID: \"72df63a8-0670-4a87-b85a-926ad14f6594\") " pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:29.379017 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.378987 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72df63a8-0670-4a87-b85a-926ad14f6594-kserve-provision-location\") pod \"isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p\" (UID: \"72df63a8-0670-4a87-b85a-926ad14f6594\") " pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:29.479363 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.479321 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-raw-f1c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/72df63a8-0670-4a87-b85a-926ad14f6594-isvc-logger-raw-f1c6b-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p\" (UID: \"72df63a8-0670-4a87-b85a-926ad14f6594\") " pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:29.479509 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.479405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwq8v\" (UniqueName: \"kubernetes.io/projected/72df63a8-0670-4a87-b85a-926ad14f6594-kube-api-access-lwq8v\") pod \"isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p\" (UID: \"72df63a8-0670-4a87-b85a-926ad14f6594\") " pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:29.479643 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.479619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/72df63a8-0670-4a87-b85a-926ad14f6594-proxy-tls\") pod \"isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p\" (UID: \"72df63a8-0670-4a87-b85a-926ad14f6594\") " pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:29.479889 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.479868 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72df63a8-0670-4a87-b85a-926ad14f6594-kserve-provision-location\") pod \"isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p\" (UID: \"72df63a8-0670-4a87-b85a-926ad14f6594\") " pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:29.480102 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.480076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-raw-f1c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/72df63a8-0670-4a87-b85a-926ad14f6594-isvc-logger-raw-f1c6b-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p\" (UID: \"72df63a8-0670-4a87-b85a-926ad14f6594\") " pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:29.480306 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.480286 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72df63a8-0670-4a87-b85a-926ad14f6594-kserve-provision-location\") pod \"isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p\" (UID: \"72df63a8-0670-4a87-b85a-926ad14f6594\") " pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:29.482102 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.482084 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72df63a8-0670-4a87-b85a-926ad14f6594-proxy-tls\") pod 
\"isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p\" (UID: \"72df63a8-0670-4a87-b85a-926ad14f6594\") " pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:29.487245 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.487222 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwq8v\" (UniqueName: \"kubernetes.io/projected/72df63a8-0670-4a87-b85a-926ad14f6594-kube-api-access-lwq8v\") pod \"isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p\" (UID: \"72df63a8-0670-4a87-b85a-926ad14f6594\") " pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:29.584585 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.584508 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:29.705507 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.705482 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p"] Apr 23 18:07:29.707700 ip-10-0-132-102 kubenswrapper[2576]: W0423 18:07:29.707656 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72df63a8_0670_4a87_b85a_926ad14f6594.slice/crio-eef4d8a84ba5128f355bf57e9c099a24aada15e5fc858c41ebc383c662abb237 WatchSource:0}: Error finding container eef4d8a84ba5128f355bf57e9c099a24aada15e5fc858c41ebc383c662abb237: Status 404 returned error can't find the container with id eef4d8a84ba5128f355bf57e9c099a24aada15e5fc858c41ebc383c662abb237 Apr 23 18:07:29.709438 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.709424 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:07:29.873342 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.868976 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" event={"ID":"72df63a8-0670-4a87-b85a-926ad14f6594","Type":"ContainerStarted","Data":"164fb8cba7f5f539f32c33e03f10ccf20f568062e8cf2960b9b8572ab919c422"} Apr 23 18:07:29.873342 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:29.869019 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" event={"ID":"72df63a8-0670-4a87-b85a-926ad14f6594","Type":"ContainerStarted","Data":"eef4d8a84ba5128f355bf57e9c099a24aada15e5fc858c41ebc383c662abb237"} Apr 23 18:07:33.881113 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:33.881077 2576 generic.go:358] "Generic (PLEG): container finished" podID="72df63a8-0670-4a87-b85a-926ad14f6594" containerID="164fb8cba7f5f539f32c33e03f10ccf20f568062e8cf2960b9b8572ab919c422" exitCode=0 Apr 23 18:07:33.881486 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:33.881149 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" event={"ID":"72df63a8-0670-4a87-b85a-926ad14f6594","Type":"ContainerDied","Data":"164fb8cba7f5f539f32c33e03f10ccf20f568062e8cf2960b9b8572ab919c422"} Apr 23 18:07:34.886054 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:34.886022 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" event={"ID":"72df63a8-0670-4a87-b85a-926ad14f6594","Type":"ContainerStarted","Data":"94358ffcb8f6fb883716968fed2ff1557b3f7b9f5ab2ee2c56400ffdbb5a0f1f"} Apr 23 18:07:34.886441 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:34.886066 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" event={"ID":"72df63a8-0670-4a87-b85a-926ad14f6594","Type":"ContainerStarted","Data":"a76ccde1c587d3e905a67e1e63d8efa0eeceb23dfb387683c32d8e0149e99f37"} Apr 23 18:07:34.886441 ip-10-0-132-102 
kubenswrapper[2576]: I0423 18:07:34.886080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" event={"ID":"72df63a8-0670-4a87-b85a-926ad14f6594","Type":"ContainerStarted","Data":"a88f86c80de4e679195b514f46e0c26ef5245690dab19f1bc5eef42430dc0dba"} Apr 23 18:07:34.886441 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:34.886376 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:34.886597 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:34.886512 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:34.887674 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:34.887649 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 18:07:34.905974 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:34.905934 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podStartSLOduration=5.9059220329999995 podStartE2EDuration="5.905922033s" podCreationTimestamp="2026-04-23 18:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:07:34.903980443 +0000 UTC m=+925.139427146" watchObservedRunningTime="2026-04-23 18:07:34.905922033 +0000 UTC m=+925.141368822" Apr 23 18:07:35.891226 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:35.891177 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 18:07:35.891636 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:35.891268 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:35.892346 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:35.892320 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:07:36.894554 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:36.894517 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 18:07:36.895008 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:36.894861 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:07:41.898990 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:41.898960 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:07:41.899569 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:41.899535 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" 
podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 18:07:41.899909 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:41.899874 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:07:51.899813 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:51.899767 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 18:07:51.900321 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:07:51.900283 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:08:01.900152 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:01.900101 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 18:08:01.900638 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:01.900612 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:08:11.899470 
ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:11.899428 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 18:08:11.899963 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:11.899921 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:08:21.899857 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:21.899810 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 18:08:21.900308 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:21.900254 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:08:31.899914 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:31.899826 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 18:08:31.900376 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:31.900267 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:08:41.900378 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:41.900346 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:08:41.900794 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:41.900557 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:08:54.263423 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.263389 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2_e6861b4e-d7a6-4e70-9d96-fd8771c8c90a/kserve-container/0.log" Apr 23 18:08:54.456177 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.456149 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p"] Apr 23 18:08:54.456524 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.456486 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kserve-container" containerID="cri-o://a88f86c80de4e679195b514f46e0c26ef5245690dab19f1bc5eef42430dc0dba" gracePeriod=30 Apr 23 18:08:54.456623 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.456593 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="agent" containerID="cri-o://94358ffcb8f6fb883716968fed2ff1557b3f7b9f5ab2ee2c56400ffdbb5a0f1f" gracePeriod=30 Apr 23 18:08:54.456686 
ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.456593 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kube-rbac-proxy" containerID="cri-o://a76ccde1c587d3e905a67e1e63d8efa0eeceb23dfb387683c32d8e0149e99f37" gracePeriod=30 Apr 23 18:08:54.485087 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.485061 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m"] Apr 23 18:08:54.487357 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.487340 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:08:54.489200 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.489176 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-8636b-predictor-serving-cert\"" Apr 23 18:08:54.489374 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.489357 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-8636b-kube-rbac-proxy-sar-config\"" Apr 23 18:08:54.499059 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.499040 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m"] Apr 23 18:08:54.557213 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.557129 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2"] Apr 23 18:08:54.557457 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.557433 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" 
podUID="e6861b4e-d7a6-4e70-9d96-fd8771c8c90a" containerName="kserve-container" containerID="cri-o://2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a" gracePeriod=30 Apr 23 18:08:54.557535 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.557481 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" podUID="e6861b4e-d7a6-4e70-9d96-fd8771c8c90a" containerName="kube-rbac-proxy" containerID="cri-o://560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c" gracePeriod=30 Apr 23 18:08:54.633694 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.633663 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-proxy-tls\") pod \"isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m\" (UID: \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:08:54.633860 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.633705 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-scale-raw-8636b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-isvc-sklearn-scale-raw-8636b-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m\" (UID: \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:08:54.633860 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.633736 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-kserve-provision-location\") pod 
\"isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m\" (UID: \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:08:54.633860 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.633834 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjbbg\" (UniqueName: \"kubernetes.io/projected/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-kube-api-access-hjbbg\") pod \"isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m\" (UID: \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:08:54.735342 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.735290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-proxy-tls\") pod \"isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m\" (UID: \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:08:54.735504 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.735349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-scale-raw-8636b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-isvc-sklearn-scale-raw-8636b-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m\" (UID: \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:08:54.735504 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.735385 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m\" (UID: \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:08:54.735504 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.735429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjbbg\" (UniqueName: \"kubernetes.io/projected/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-kube-api-access-hjbbg\") pod \"isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m\" (UID: \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:08:54.736196 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.736171 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m\" (UID: \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:08:54.736337 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.736177 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-scale-raw-8636b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-isvc-sklearn-scale-raw-8636b-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m\" (UID: \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:08:54.738710 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.738682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-proxy-tls\") pod \"isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m\" (UID: \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:08:54.745187 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.745161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjbbg\" (UniqueName: \"kubernetes.io/projected/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-kube-api-access-hjbbg\") pod \"isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m\" (UID: \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:08:54.784876 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.784856 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:08:54.796884 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.796861 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:08:54.915581 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.915547 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m"] Apr 23 18:08:54.919062 ip-10-0-132-102 kubenswrapper[2576]: W0423 18:08:54.919033 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod405f9c55_cdf2_4708_a0cf_3c8d34ebbf7c.slice/crio-81b755f966b3dfad34f45d066ff1174f86742390988d82a71aa91b5579c0a2f1 WatchSource:0}: Error finding container 81b755f966b3dfad34f45d066ff1174f86742390988d82a71aa91b5579c0a2f1: Status 404 returned error can't find the container with id 81b755f966b3dfad34f45d066ff1174f86742390988d82a71aa91b5579c0a2f1 Apr 23 18:08:54.936393 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.936369 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-raw-f1c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-message-dumper-raw-f1c6b-kube-rbac-proxy-sar-config\") pod \"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a\" (UID: \"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a\") " Apr 23 18:08:54.936510 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.936423 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-proxy-tls\") pod \"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a\" (UID: \"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a\") " Apr 23 18:08:54.936510 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.936451 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vcxt\" (UniqueName: \"kubernetes.io/projected/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-kube-api-access-6vcxt\") pod 
\"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a\" (UID: \"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a\") " Apr 23 18:08:54.936828 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.936774 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-message-dumper-raw-f1c6b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-raw-f1c6b-kube-rbac-proxy-sar-config") pod "e6861b4e-d7a6-4e70-9d96-fd8771c8c90a" (UID: "e6861b4e-d7a6-4e70-9d96-fd8771c8c90a"). InnerVolumeSpecName "message-dumper-raw-f1c6b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:08:54.938611 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.938579 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e6861b4e-d7a6-4e70-9d96-fd8771c8c90a" (UID: "e6861b4e-d7a6-4e70-9d96-fd8771c8c90a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:08:54.938702 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:54.938675 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-kube-api-access-6vcxt" (OuterVolumeSpecName: "kube-api-access-6vcxt") pod "e6861b4e-d7a6-4e70-9d96-fd8771c8c90a" (UID: "e6861b4e-d7a6-4e70-9d96-fd8771c8c90a"). InnerVolumeSpecName "kube-api-access-6vcxt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:08:55.037919 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.037885 2576 reconciler_common.go:299] "Volume detached for volume \"message-dumper-raw-f1c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-message-dumper-raw-f1c6b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:08:55.037919 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.037914 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-proxy-tls\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:08:55.037919 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.037924 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6vcxt\" (UniqueName: \"kubernetes.io/projected/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a-kube-api-access-6vcxt\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:08:55.104234 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.104148 2576 generic.go:358] "Generic (PLEG): container finished" podID="e6861b4e-d7a6-4e70-9d96-fd8771c8c90a" containerID="560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c" exitCode=2 Apr 23 18:08:55.104234 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.104179 2576 generic.go:358] "Generic (PLEG): container finished" podID="e6861b4e-d7a6-4e70-9d96-fd8771c8c90a" containerID="2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a" exitCode=2 Apr 23 18:08:55.104234 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.104231 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" Apr 23 18:08:55.104522 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.104228 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" event={"ID":"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a","Type":"ContainerDied","Data":"560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c"} Apr 23 18:08:55.104522 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.104346 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" event={"ID":"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a","Type":"ContainerDied","Data":"2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a"} Apr 23 18:08:55.104522 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.104363 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2" event={"ID":"e6861b4e-d7a6-4e70-9d96-fd8771c8c90a","Type":"ContainerDied","Data":"652e0053bb45191c031001dce3c8fa4e8aecfc8c5a172d6ac99bf08f76b08478"} Apr 23 18:08:55.104522 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.104383 2576 scope.go:117] "RemoveContainer" containerID="560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c" Apr 23 18:08:55.106547 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.106526 2576 generic.go:358] "Generic (PLEG): container finished" podID="72df63a8-0670-4a87-b85a-926ad14f6594" containerID="a76ccde1c587d3e905a67e1e63d8efa0eeceb23dfb387683c32d8e0149e99f37" exitCode=2 Apr 23 18:08:55.106644 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.106577 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" 
event={"ID":"72df63a8-0670-4a87-b85a-926ad14f6594","Type":"ContainerDied","Data":"a76ccde1c587d3e905a67e1e63d8efa0eeceb23dfb387683c32d8e0149e99f37"} Apr 23 18:08:55.108110 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.108085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" event={"ID":"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c","Type":"ContainerStarted","Data":"38ea0cc89de3737c574dffe38f6fea9f1b23eab1152314be9bf7a0d835f1965c"} Apr 23 18:08:55.108197 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.108148 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" event={"ID":"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c","Type":"ContainerStarted","Data":"81b755f966b3dfad34f45d066ff1174f86742390988d82a71aa91b5579c0a2f1"} Apr 23 18:08:55.113026 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.113007 2576 scope.go:117] "RemoveContainer" containerID="2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a" Apr 23 18:08:55.124796 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.124771 2576 scope.go:117] "RemoveContainer" containerID="560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c" Apr 23 18:08:55.125157 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:08:55.125132 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c\": container with ID starting with 560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c not found: ID does not exist" containerID="560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c" Apr 23 18:08:55.125543 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.125167 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c"} err="failed to get container status \"560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c\": rpc error: code = NotFound desc = could not find container \"560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c\": container with ID starting with 560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c not found: ID does not exist" Apr 23 18:08:55.125543 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.125193 2576 scope.go:117] "RemoveContainer" containerID="2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a" Apr 23 18:08:55.125683 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:08:55.125536 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a\": container with ID starting with 2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a not found: ID does not exist" containerID="2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a" Apr 23 18:08:55.125683 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.125571 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a"} err="failed to get container status \"2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a\": rpc error: code = NotFound desc = could not find container \"2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a\": container with ID starting with 2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a not found: ID does not exist" Apr 23 18:08:55.125683 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.125596 2576 scope.go:117] "RemoveContainer" containerID="560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c" Apr 23 18:08:55.125895 ip-10-0-132-102 
kubenswrapper[2576]: I0423 18:08:55.125872 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c"} err="failed to get container status \"560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c\": rpc error: code = NotFound desc = could not find container \"560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c\": container with ID starting with 560b29f5ef9b5625f74330a25c839d36fee9ee2c94662ecda6d48f84669d4c2c not found: ID does not exist" Apr 23 18:08:55.125944 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.125897 2576 scope.go:117] "RemoveContainer" containerID="2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a" Apr 23 18:08:55.126118 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.126102 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a"} err="failed to get container status \"2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a\": rpc error: code = NotFound desc = could not find container \"2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a\": container with ID starting with 2b6d6beabecbab8ccf25f5bb16a5e8d8e3fa9635b1ee7b5d3b003cf25aa9cb7a not found: ID does not exist" Apr 23 18:08:55.138301 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.138277 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2"] Apr 23 18:08:55.143908 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:55.143885 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-f1c6b-predictor-5bf88487b4-vdqr2"] Apr 23 18:08:56.394997 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:56.394963 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e6861b4e-d7a6-4e70-9d96-fd8771c8c90a" path="/var/lib/kubelet/pods/e6861b4e-d7a6-4e70-9d96-fd8771c8c90a/volumes" Apr 23 18:08:56.894901 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:56.894868 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 23 18:08:59.122009 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:59.121978 2576 generic.go:358] "Generic (PLEG): container finished" podID="72df63a8-0670-4a87-b85a-926ad14f6594" containerID="a88f86c80de4e679195b514f46e0c26ef5245690dab19f1bc5eef42430dc0dba" exitCode=0 Apr 23 18:08:59.122408 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:59.122052 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" event={"ID":"72df63a8-0670-4a87-b85a-926ad14f6594","Type":"ContainerDied","Data":"a88f86c80de4e679195b514f46e0c26ef5245690dab19f1bc5eef42430dc0dba"} Apr 23 18:08:59.123277 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:59.123256 2576 generic.go:358] "Generic (PLEG): container finished" podID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerID="38ea0cc89de3737c574dffe38f6fea9f1b23eab1152314be9bf7a0d835f1965c" exitCode=0 Apr 23 18:08:59.123368 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:08:59.123313 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" event={"ID":"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c","Type":"ContainerDied","Data":"38ea0cc89de3737c574dffe38f6fea9f1b23eab1152314be9bf7a0d835f1965c"} Apr 23 18:09:00.127736 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:00.127696 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" event={"ID":"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c","Type":"ContainerStarted","Data":"d9f56c7058876f754d2595f3031bb78b951377cac6ec8d78f9711fca54202bee"} Apr 23 18:09:00.127736 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:00.127732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" event={"ID":"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c","Type":"ContainerStarted","Data":"ab2f569ed9d443c126b4dff24d1d5ea60707b6e9e7f52c53e52f663acb3eee94"} Apr 23 18:09:00.128145 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:00.127926 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:09:00.145467 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:00.145420 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podStartSLOduration=6.145406359 podStartE2EDuration="6.145406359s" podCreationTimestamp="2026-04-23 18:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:09:00.144468757 +0000 UTC m=+1010.379915463" watchObservedRunningTime="2026-04-23 18:09:00.145406359 +0000 UTC m=+1010.380853062" Apr 23 18:09:01.130200 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:01.130170 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:09:01.131445 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:01.131416 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 18:09:01.895016 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:01.894971 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 23 18:09:01.900378 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:01.900338 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 18:09:01.900664 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:01.900644 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:09:02.137243 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:02.137190 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 18:09:06.894820 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:06.894737 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 23 18:09:06.895301 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:06.894898 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:09:07.139685 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:07.139654 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:09:07.140201 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:07.140173 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 18:09:11.895541 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:11.895490 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 23 18:09:11.899827 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:11.899790 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 18:09:11.900197 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:11.900173 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" 
podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:09:16.895300 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:16.895255 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 23 18:09:17.141042 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:17.141003 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 18:09:21.895019 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:21.894971 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 23 18:09:21.900375 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:21.900340 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 18:09:21.900527 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:21.900447 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" Apr 23 18:09:21.900804 
ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:21.900781 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:09:21.900880 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:21.900865 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p"
Apr 23 18:09:25.100975 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.100949 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p"
Apr 23 18:09:25.199373 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.199341 2576 generic.go:358] "Generic (PLEG): container finished" podID="72df63a8-0670-4a87-b85a-926ad14f6594" containerID="94358ffcb8f6fb883716968fed2ff1557b3f7b9f5ab2ee2c56400ffdbb5a0f1f" exitCode=0
Apr 23 18:09:25.199558 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.199429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" event={"ID":"72df63a8-0670-4a87-b85a-926ad14f6594","Type":"ContainerDied","Data":"94358ffcb8f6fb883716968fed2ff1557b3f7b9f5ab2ee2c56400ffdbb5a0f1f"}
Apr 23 18:09:25.199558 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.199465 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p" event={"ID":"72df63a8-0670-4a87-b85a-926ad14f6594","Type":"ContainerDied","Data":"eef4d8a84ba5128f355bf57e9c099a24aada15e5fc858c41ebc383c662abb237"}
Apr 23 18:09:25.199558 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.199481 2576 scope.go:117] "RemoveContainer" containerID="94358ffcb8f6fb883716968fed2ff1557b3f7b9f5ab2ee2c56400ffdbb5a0f1f"
Apr 23 18:09:25.199558 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.199439 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p"
Apr 23 18:09:25.207186 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.207164 2576 scope.go:117] "RemoveContainer" containerID="a76ccde1c587d3e905a67e1e63d8efa0eeceb23dfb387683c32d8e0149e99f37"
Apr 23 18:09:25.214273 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.214247 2576 scope.go:117] "RemoveContainer" containerID="a88f86c80de4e679195b514f46e0c26ef5245690dab19f1bc5eef42430dc0dba"
Apr 23 18:09:25.221557 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.221510 2576 scope.go:117] "RemoveContainer" containerID="164fb8cba7f5f539f32c33e03f10ccf20f568062e8cf2960b9b8572ab919c422"
Apr 23 18:09:25.228395 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.228374 2576 scope.go:117] "RemoveContainer" containerID="94358ffcb8f6fb883716968fed2ff1557b3f7b9f5ab2ee2c56400ffdbb5a0f1f"
Apr 23 18:09:25.228669 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:09:25.228649 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94358ffcb8f6fb883716968fed2ff1557b3f7b9f5ab2ee2c56400ffdbb5a0f1f\": container with ID starting with 94358ffcb8f6fb883716968fed2ff1557b3f7b9f5ab2ee2c56400ffdbb5a0f1f not found: ID does not exist" containerID="94358ffcb8f6fb883716968fed2ff1557b3f7b9f5ab2ee2c56400ffdbb5a0f1f"
Apr 23 18:09:25.228730 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.228677 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94358ffcb8f6fb883716968fed2ff1557b3f7b9f5ab2ee2c56400ffdbb5a0f1f"} err="failed to get container status \"94358ffcb8f6fb883716968fed2ff1557b3f7b9f5ab2ee2c56400ffdbb5a0f1f\": rpc error: code = NotFound desc = could not find container \"94358ffcb8f6fb883716968fed2ff1557b3f7b9f5ab2ee2c56400ffdbb5a0f1f\": container with ID starting with 94358ffcb8f6fb883716968fed2ff1557b3f7b9f5ab2ee2c56400ffdbb5a0f1f not found: ID does not exist"
Apr 23 18:09:25.228730 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.228697 2576 scope.go:117] "RemoveContainer" containerID="a76ccde1c587d3e905a67e1e63d8efa0eeceb23dfb387683c32d8e0149e99f37"
Apr 23 18:09:25.228948 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:09:25.228931 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a76ccde1c587d3e905a67e1e63d8efa0eeceb23dfb387683c32d8e0149e99f37\": container with ID starting with a76ccde1c587d3e905a67e1e63d8efa0eeceb23dfb387683c32d8e0149e99f37 not found: ID does not exist" containerID="a76ccde1c587d3e905a67e1e63d8efa0eeceb23dfb387683c32d8e0149e99f37"
Apr 23 18:09:25.228990 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.228956 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a76ccde1c587d3e905a67e1e63d8efa0eeceb23dfb387683c32d8e0149e99f37"} err="failed to get container status \"a76ccde1c587d3e905a67e1e63d8efa0eeceb23dfb387683c32d8e0149e99f37\": rpc error: code = NotFound desc = could not find container \"a76ccde1c587d3e905a67e1e63d8efa0eeceb23dfb387683c32d8e0149e99f37\": container with ID starting with a76ccde1c587d3e905a67e1e63d8efa0eeceb23dfb387683c32d8e0149e99f37 not found: ID does not exist"
Apr 23 18:09:25.228990 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.228971 2576 scope.go:117] "RemoveContainer" containerID="a88f86c80de4e679195b514f46e0c26ef5245690dab19f1bc5eef42430dc0dba"
Apr 23 18:09:25.229134 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:09:25.229118 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a88f86c80de4e679195b514f46e0c26ef5245690dab19f1bc5eef42430dc0dba\": container with ID starting with a88f86c80de4e679195b514f46e0c26ef5245690dab19f1bc5eef42430dc0dba not found: ID does not exist" containerID="a88f86c80de4e679195b514f46e0c26ef5245690dab19f1bc5eef42430dc0dba"
Apr 23 18:09:25.229179 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.229140 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a88f86c80de4e679195b514f46e0c26ef5245690dab19f1bc5eef42430dc0dba"} err="failed to get container status \"a88f86c80de4e679195b514f46e0c26ef5245690dab19f1bc5eef42430dc0dba\": rpc error: code = NotFound desc = could not find container \"a88f86c80de4e679195b514f46e0c26ef5245690dab19f1bc5eef42430dc0dba\": container with ID starting with a88f86c80de4e679195b514f46e0c26ef5245690dab19f1bc5eef42430dc0dba not found: ID does not exist"
Apr 23 18:09:25.229179 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.229157 2576 scope.go:117] "RemoveContainer" containerID="164fb8cba7f5f539f32c33e03f10ccf20f568062e8cf2960b9b8572ab919c422"
Apr 23 18:09:25.229377 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:09:25.229363 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"164fb8cba7f5f539f32c33e03f10ccf20f568062e8cf2960b9b8572ab919c422\": container with ID starting with 164fb8cba7f5f539f32c33e03f10ccf20f568062e8cf2960b9b8572ab919c422 not found: ID does not exist" containerID="164fb8cba7f5f539f32c33e03f10ccf20f568062e8cf2960b9b8572ab919c422"
Apr 23 18:09:25.229425 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.229382 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"164fb8cba7f5f539f32c33e03f10ccf20f568062e8cf2960b9b8572ab919c422"} err="failed to get container status \"164fb8cba7f5f539f32c33e03f10ccf20f568062e8cf2960b9b8572ab919c422\": rpc error: code = NotFound desc = could not find container \"164fb8cba7f5f539f32c33e03f10ccf20f568062e8cf2960b9b8572ab919c422\": container with ID starting with 164fb8cba7f5f539f32c33e03f10ccf20f568062e8cf2960b9b8572ab919c422 not found: ID does not exist"
Apr 23 18:09:25.280885 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.280849 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72df63a8-0670-4a87-b85a-926ad14f6594-proxy-tls\") pod \"72df63a8-0670-4a87-b85a-926ad14f6594\" (UID: \"72df63a8-0670-4a87-b85a-926ad14f6594\") "
Apr 23 18:09:25.281013 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.280903 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-raw-f1c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/72df63a8-0670-4a87-b85a-926ad14f6594-isvc-logger-raw-f1c6b-kube-rbac-proxy-sar-config\") pod \"72df63a8-0670-4a87-b85a-926ad14f6594\" (UID: \"72df63a8-0670-4a87-b85a-926ad14f6594\") "
Apr 23 18:09:25.281013 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.280925 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwq8v\" (UniqueName: \"kubernetes.io/projected/72df63a8-0670-4a87-b85a-926ad14f6594-kube-api-access-lwq8v\") pod \"72df63a8-0670-4a87-b85a-926ad14f6594\" (UID: \"72df63a8-0670-4a87-b85a-926ad14f6594\") "
Apr 23 18:09:25.281013 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.280969 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72df63a8-0670-4a87-b85a-926ad14f6594-kserve-provision-location\") pod \"72df63a8-0670-4a87-b85a-926ad14f6594\" (UID: \"72df63a8-0670-4a87-b85a-926ad14f6594\") "
Apr 23 18:09:25.281383 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.281346 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72df63a8-0670-4a87-b85a-926ad14f6594-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "72df63a8-0670-4a87-b85a-926ad14f6594" (UID: "72df63a8-0670-4a87-b85a-926ad14f6594"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:09:25.281459 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.281356 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72df63a8-0670-4a87-b85a-926ad14f6594-isvc-logger-raw-f1c6b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-raw-f1c6b-kube-rbac-proxy-sar-config") pod "72df63a8-0670-4a87-b85a-926ad14f6594" (UID: "72df63a8-0670-4a87-b85a-926ad14f6594"). InnerVolumeSpecName "isvc-logger-raw-f1c6b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:09:25.283084 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.283054 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72df63a8-0670-4a87-b85a-926ad14f6594-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "72df63a8-0670-4a87-b85a-926ad14f6594" (UID: "72df63a8-0670-4a87-b85a-926ad14f6594"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:09:25.283172 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.283084 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72df63a8-0670-4a87-b85a-926ad14f6594-kube-api-access-lwq8v" (OuterVolumeSpecName: "kube-api-access-lwq8v") pod "72df63a8-0670-4a87-b85a-926ad14f6594" (UID: "72df63a8-0670-4a87-b85a-926ad14f6594"). InnerVolumeSpecName "kube-api-access-lwq8v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:09:25.382144 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.382106 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72df63a8-0670-4a87-b85a-926ad14f6594-proxy-tls\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:09:25.382144 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.382135 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-raw-f1c6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/72df63a8-0670-4a87-b85a-926ad14f6594-isvc-logger-raw-f1c6b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:09:25.382144 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.382147 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lwq8v\" (UniqueName: \"kubernetes.io/projected/72df63a8-0670-4a87-b85a-926ad14f6594-kube-api-access-lwq8v\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:09:25.382378 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.382159 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72df63a8-0670-4a87-b85a-926ad14f6594-kserve-provision-location\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:09:25.520500 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.520467 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p"]
Apr 23 18:09:25.527019 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:25.526991 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-f1c6b-predictor-796f9c8994-2429p"]
Apr 23 18:09:26.393448 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:26.393415 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" path="/var/lib/kubelet/pods/72df63a8-0670-4a87-b85a-926ad14f6594/volumes"
Apr 23 18:09:27.140510 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:27.140470 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:09:37.140219 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:37.140174 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:09:47.140915 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:47.140873 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:09:57.140883 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:09:57.140803 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:10:07.140371 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:10:07.140328 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:10:17.140175 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:10:17.140137 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:10:27.141146 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:10:27.141099 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:10:34.390043 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:10:34.390002 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:10:44.390613 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:10:44.390566 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:10:54.390022 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:10:54.389977 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:11:04.390154 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:04.390110 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 18:11:14.393047 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:14.393020 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m"
Apr 23 18:11:24.697659 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.697571 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m"]
Apr 23 18:11:24.698225 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.697957 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" containerID="cri-o://ab2f569ed9d443c126b4dff24d1d5ea60707b6e9e7f52c53e52f663acb3eee94" gracePeriod=30
Apr 23 18:11:24.698225 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.698027 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kube-rbac-proxy" containerID="cri-o://d9f56c7058876f754d2595f3031bb78b951377cac6ec8d78f9711fca54202bee" gracePeriod=30
Apr 23 18:11:24.794703 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.794668 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"]
Apr 23 18:11:24.794954 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.794941 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kube-rbac-proxy"
Apr 23 18:11:24.795002 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.794955 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kube-rbac-proxy"
Apr 23 18:11:24.795002 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.794967 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="agent"
Apr 23 18:11:24.795002 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.794973 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="agent"
Apr 23 18:11:24.795002 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.794981 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="storage-initializer"
Apr 23 18:11:24.795002 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.794986 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="storage-initializer"
Apr 23 18:11:24.795002 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.794993 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kserve-container"
Apr 23 18:11:24.795002 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.794998 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kserve-container"
Apr 23 18:11:24.795211 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.795012 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6861b4e-d7a6-4e70-9d96-fd8771c8c90a" containerName="kserve-container"
Apr 23 18:11:24.795211 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.795018 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6861b4e-d7a6-4e70-9d96-fd8771c8c90a" containerName="kserve-container"
Apr 23 18:11:24.795211 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.795027 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6861b4e-d7a6-4e70-9d96-fd8771c8c90a" containerName="kube-rbac-proxy"
Apr 23 18:11:24.795211 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.795032 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6861b4e-d7a6-4e70-9d96-fd8771c8c90a" containerName="kube-rbac-proxy"
Apr 23 18:11:24.795211 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.795071 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6861b4e-d7a6-4e70-9d96-fd8771c8c90a" containerName="kube-rbac-proxy"
Apr 23 18:11:24.795211 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.795077 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6861b4e-d7a6-4e70-9d96-fd8771c8c90a" containerName="kserve-container"
Apr 23 18:11:24.795211 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.795083 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kube-rbac-proxy"
Apr 23 18:11:24.795211 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.795088 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="agent"
Apr 23 18:11:24.795211 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.795096 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="72df63a8-0670-4a87-b85a-926ad14f6594" containerName="kserve-container"
Apr 23 18:11:24.798151 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.798121 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:24.800015 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.799993 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-b0e9d2-predictor-serving-cert\""
Apr 23 18:11:24.800143 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.800015 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-b0e9d2-kube-rbac-proxy-sar-config\""
Apr 23 18:11:24.805998 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.805973 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"]
Apr 23 18:11:24.878767 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.878703 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b819f9d6-5d46-4210-96a1-25228eae230f-proxy-tls\") pod \"isvc-primary-b0e9d2-predictor-54c659f8-4mp62\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") " pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:24.878767 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.878763 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-b0e9d2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b819f9d6-5d46-4210-96a1-25228eae230f-isvc-primary-b0e9d2-kube-rbac-proxy-sar-config\") pod \"isvc-primary-b0e9d2-predictor-54c659f8-4mp62\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") " pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:24.878978 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.878796 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjdft\" (UniqueName: \"kubernetes.io/projected/b819f9d6-5d46-4210-96a1-25228eae230f-kube-api-access-wjdft\") pod \"isvc-primary-b0e9d2-predictor-54c659f8-4mp62\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") " pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:24.878978 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.878898 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b819f9d6-5d46-4210-96a1-25228eae230f-kserve-provision-location\") pod \"isvc-primary-b0e9d2-predictor-54c659f8-4mp62\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") " pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:24.979879 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.979791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b819f9d6-5d46-4210-96a1-25228eae230f-kserve-provision-location\") pod \"isvc-primary-b0e9d2-predictor-54c659f8-4mp62\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") " pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:24.979879 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.979838 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b819f9d6-5d46-4210-96a1-25228eae230f-proxy-tls\") pod \"isvc-primary-b0e9d2-predictor-54c659f8-4mp62\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") " pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:24.980072 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:11:24.979936 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-serving-cert: secret "isvc-primary-b0e9d2-predictor-serving-cert" not found
Apr 23 18:11:24.980072 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.979961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-b0e9d2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b819f9d6-5d46-4210-96a1-25228eae230f-isvc-primary-b0e9d2-kube-rbac-proxy-sar-config\") pod \"isvc-primary-b0e9d2-predictor-54c659f8-4mp62\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") " pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:24.980072 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:11:24.980006 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b819f9d6-5d46-4210-96a1-25228eae230f-proxy-tls podName:b819f9d6-5d46-4210-96a1-25228eae230f nodeName:}" failed. No retries permitted until 2026-04-23 18:11:25.479988996 +0000 UTC m=+1155.715435677 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b819f9d6-5d46-4210-96a1-25228eae230f-proxy-tls") pod "isvc-primary-b0e9d2-predictor-54c659f8-4mp62" (UID: "b819f9d6-5d46-4210-96a1-25228eae230f") : secret "isvc-primary-b0e9d2-predictor-serving-cert" not found
Apr 23 18:11:24.980072 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.980031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjdft\" (UniqueName: \"kubernetes.io/projected/b819f9d6-5d46-4210-96a1-25228eae230f-kube-api-access-wjdft\") pod \"isvc-primary-b0e9d2-predictor-54c659f8-4mp62\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") " pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:24.980281 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.980204 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b819f9d6-5d46-4210-96a1-25228eae230f-kserve-provision-location\") pod \"isvc-primary-b0e9d2-predictor-54c659f8-4mp62\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") " pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:24.980521 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.980499 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-b0e9d2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b819f9d6-5d46-4210-96a1-25228eae230f-isvc-primary-b0e9d2-kube-rbac-proxy-sar-config\") pod \"isvc-primary-b0e9d2-predictor-54c659f8-4mp62\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") " pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:24.991139 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:24.991118 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjdft\" (UniqueName: \"kubernetes.io/projected/b819f9d6-5d46-4210-96a1-25228eae230f-kube-api-access-wjdft\") pod \"isvc-primary-b0e9d2-predictor-54c659f8-4mp62\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") " pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:25.483707 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:25.483675 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b819f9d6-5d46-4210-96a1-25228eae230f-proxy-tls\") pod \"isvc-primary-b0e9d2-predictor-54c659f8-4mp62\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") " pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:25.485889 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:25.485866 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b819f9d6-5d46-4210-96a1-25228eae230f-proxy-tls\") pod \"isvc-primary-b0e9d2-predictor-54c659f8-4mp62\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") " pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:25.525013 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:25.524981 2576 generic.go:358] "Generic (PLEG): container finished" podID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerID="d9f56c7058876f754d2595f3031bb78b951377cac6ec8d78f9711fca54202bee" exitCode=2
Apr 23 18:09:25.525013 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:25.525016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" event={"ID":"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c","Type":"ContainerDied","Data":"d9f56c7058876f754d2595f3031bb78b951377cac6ec8d78f9711fca54202bee"}
Apr 23 18:11:25.709974 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:25.709942 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:25.832305 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:25.832276 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"]
Apr 23 18:11:25.835228 ip-10-0-132-102 kubenswrapper[2576]: W0423 18:11:25.835202 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb819f9d6_5d46_4210_96a1_25228eae230f.slice/crio-805147926d2d43515cd835a423c65ba77cc97dbfddd5d5bbea8191b91322d998 WatchSource:0}: Error finding container 805147926d2d43515cd835a423c65ba77cc97dbfddd5d5bbea8191b91322d998: Status 404 returned error can't find the container with id 805147926d2d43515cd835a423c65ba77cc97dbfddd5d5bbea8191b91322d998
Apr 23 18:11:26.528840 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:26.528804 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" event={"ID":"b819f9d6-5d46-4210-96a1-25228eae230f","Type":"ContainerStarted","Data":"835199f614880428bb1ff8b457884ed5224f386eec7198f1641869f9132d19ec"}
Apr 23 18:11:26.528840 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:26.528843 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" event={"ID":"b819f9d6-5d46-4210-96a1-25228eae230f","Type":"ContainerStarted","Data":"805147926d2d43515cd835a423c65ba77cc97dbfddd5d5bbea8191b91322d998"}
Apr 23 18:11:27.136078 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:27.136038 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused"
Apr 23 18:11:29.537090 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:29.537052 2576 generic.go:358] "Generic (PLEG): container finished" podID="b819f9d6-5d46-4210-96a1-25228eae230f" containerID="835199f614880428bb1ff8b457884ed5224f386eec7198f1641869f9132d19ec" exitCode=0
Apr 23 18:11:29.537565 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:29.537130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" event={"ID":"b819f9d6-5d46-4210-96a1-25228eae230f","Type":"ContainerDied","Data":"835199f614880428bb1ff8b457884ed5224f386eec7198f1641869f9132d19ec"}
Apr 23 18:11:30.541629 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:30.541594 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" event={"ID":"b819f9d6-5d46-4210-96a1-25228eae230f","Type":"ContainerStarted","Data":"a102c6fe8d7ee5bff4b597fa46dfe3fa5bd491518bb4082a808d8fe78747c273"}
Apr 23 18:11:30.541629 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:30.541630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" event={"ID":"b819f9d6-5d46-4210-96a1-25228eae230f","Type":"ContainerStarted","Data":"51f91d580d3fbd55d2103cf98198fadb1e2326aa7d65c02f1c04eb0ee317c01b"}
Apr 23 18:11:30.542194 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:30.541837 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:30.560532 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:30.560480 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" podStartSLOduration=6.5604661029999995 podStartE2EDuration="6.560466103s" podCreationTimestamp="2026-04-23 18:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:11:30.558890165 +0000 UTC m=+1160.794336868" watchObservedRunningTime="2026-04-23 18:11:30.560466103 +0000 UTC m=+1160.795912807"
Apr 23 18:11:31.544066 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:31.544027 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:11:31.545506 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:31.545478 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 23 18:11:32.136004 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:32.135963 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused"
Apr 23 18:11:32.546974 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:32.546883 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 23 18:11:33.935837 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:33.935812 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m"
Apr 23 18:11:34.050795 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.050682 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjbbg\" (UniqueName: \"kubernetes.io/projected/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-kube-api-access-hjbbg\") pod \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\" (UID: \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\") "
Apr 23 18:11:34.050978 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.050805 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-kserve-provision-location\") pod \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\" (UID: \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\") "
Apr 23 18:11:34.050978 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.050835 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-scale-raw-8636b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-isvc-sklearn-scale-raw-8636b-kube-rbac-proxy-sar-config\") pod \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\" (UID: \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\") "
Apr 23 18:11:34.050978 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.050880 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-proxy-tls\") pod \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\" (UID: \"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c\") "
Apr 23 18:11:34.051176 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.051145 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" (UID: "405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:11:34.051176 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.051162 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-isvc-sklearn-scale-raw-8636b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-scale-raw-8636b-kube-rbac-proxy-sar-config") pod "405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" (UID: "405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c"). InnerVolumeSpecName "isvc-sklearn-scale-raw-8636b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:11:34.052850 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.052825 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" (UID: "405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c"). InnerVolumeSpecName "proxy-tls".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:11:34.053212 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.052889 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-kube-api-access-hjbbg" (OuterVolumeSpecName: "kube-api-access-hjbbg") pod "405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" (UID: "405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c"). InnerVolumeSpecName "kube-api-access-hjbbg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:11:34.151456 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.151413 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-kserve-provision-location\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:11:34.151456 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.151448 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-scale-raw-8636b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-isvc-sklearn-scale-raw-8636b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:11:34.151456 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.151459 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-proxy-tls\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:11:34.151690 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.151470 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hjbbg\" (UniqueName: \"kubernetes.io/projected/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c-kube-api-access-hjbbg\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:11:34.554203 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.554166 2576 generic.go:358] "Generic 
(PLEG): container finished" podID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerID="ab2f569ed9d443c126b4dff24d1d5ea60707b6e9e7f52c53e52f663acb3eee94" exitCode=0 Apr 23 18:11:34.554362 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.554252 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" Apr 23 18:11:34.554362 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.554253 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" event={"ID":"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c","Type":"ContainerDied","Data":"ab2f569ed9d443c126b4dff24d1d5ea60707b6e9e7f52c53e52f663acb3eee94"} Apr 23 18:11:34.554362 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.554358 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m" event={"ID":"405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c","Type":"ContainerDied","Data":"81b755f966b3dfad34f45d066ff1174f86742390988d82a71aa91b5579c0a2f1"} Apr 23 18:11:34.554480 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.554374 2576 scope.go:117] "RemoveContainer" containerID="d9f56c7058876f754d2595f3031bb78b951377cac6ec8d78f9711fca54202bee" Apr 23 18:11:34.564184 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.564143 2576 scope.go:117] "RemoveContainer" containerID="ab2f569ed9d443c126b4dff24d1d5ea60707b6e9e7f52c53e52f663acb3eee94" Apr 23 18:11:34.571024 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.570999 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m"] Apr 23 18:11:34.571783 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.571763 2576 scope.go:117] "RemoveContainer" containerID="38ea0cc89de3737c574dffe38f6fea9f1b23eab1152314be9bf7a0d835f1965c" Apr 23 18:11:34.576045 ip-10-0-132-102 
kubenswrapper[2576]: I0423 18:11:34.576019 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-8636b-predictor-57d74786b5-x2k4m"] Apr 23 18:11:34.578949 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.578928 2576 scope.go:117] "RemoveContainer" containerID="d9f56c7058876f754d2595f3031bb78b951377cac6ec8d78f9711fca54202bee" Apr 23 18:11:34.579197 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:11:34.579177 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9f56c7058876f754d2595f3031bb78b951377cac6ec8d78f9711fca54202bee\": container with ID starting with d9f56c7058876f754d2595f3031bb78b951377cac6ec8d78f9711fca54202bee not found: ID does not exist" containerID="d9f56c7058876f754d2595f3031bb78b951377cac6ec8d78f9711fca54202bee" Apr 23 18:11:34.579251 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.579206 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9f56c7058876f754d2595f3031bb78b951377cac6ec8d78f9711fca54202bee"} err="failed to get container status \"d9f56c7058876f754d2595f3031bb78b951377cac6ec8d78f9711fca54202bee\": rpc error: code = NotFound desc = could not find container \"d9f56c7058876f754d2595f3031bb78b951377cac6ec8d78f9711fca54202bee\": container with ID starting with d9f56c7058876f754d2595f3031bb78b951377cac6ec8d78f9711fca54202bee not found: ID does not exist" Apr 23 18:11:34.579251 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.579225 2576 scope.go:117] "RemoveContainer" containerID="ab2f569ed9d443c126b4dff24d1d5ea60707b6e9e7f52c53e52f663acb3eee94" Apr 23 18:11:34.579462 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:11:34.579444 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab2f569ed9d443c126b4dff24d1d5ea60707b6e9e7f52c53e52f663acb3eee94\": container with ID starting with 
ab2f569ed9d443c126b4dff24d1d5ea60707b6e9e7f52c53e52f663acb3eee94 not found: ID does not exist" containerID="ab2f569ed9d443c126b4dff24d1d5ea60707b6e9e7f52c53e52f663acb3eee94" Apr 23 18:11:34.579512 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.579468 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab2f569ed9d443c126b4dff24d1d5ea60707b6e9e7f52c53e52f663acb3eee94"} err="failed to get container status \"ab2f569ed9d443c126b4dff24d1d5ea60707b6e9e7f52c53e52f663acb3eee94\": rpc error: code = NotFound desc = could not find container \"ab2f569ed9d443c126b4dff24d1d5ea60707b6e9e7f52c53e52f663acb3eee94\": container with ID starting with ab2f569ed9d443c126b4dff24d1d5ea60707b6e9e7f52c53e52f663acb3eee94 not found: ID does not exist" Apr 23 18:11:34.579512 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.579484 2576 scope.go:117] "RemoveContainer" containerID="38ea0cc89de3737c574dffe38f6fea9f1b23eab1152314be9bf7a0d835f1965c" Apr 23 18:11:34.579706 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:11:34.579692 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38ea0cc89de3737c574dffe38f6fea9f1b23eab1152314be9bf7a0d835f1965c\": container with ID starting with 38ea0cc89de3737c574dffe38f6fea9f1b23eab1152314be9bf7a0d835f1965c not found: ID does not exist" containerID="38ea0cc89de3737c574dffe38f6fea9f1b23eab1152314be9bf7a0d835f1965c" Apr 23 18:11:34.579760 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:34.579709 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ea0cc89de3737c574dffe38f6fea9f1b23eab1152314be9bf7a0d835f1965c"} err="failed to get container status \"38ea0cc89de3737c574dffe38f6fea9f1b23eab1152314be9bf7a0d835f1965c\": rpc error: code = NotFound desc = could not find container \"38ea0cc89de3737c574dffe38f6fea9f1b23eab1152314be9bf7a0d835f1965c\": container with ID starting with 
38ea0cc89de3737c574dffe38f6fea9f1b23eab1152314be9bf7a0d835f1965c not found: ID does not exist" Apr 23 18:11:36.393417 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:36.393382 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" path="/var/lib/kubelet/pods/405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c/volumes" Apr 23 18:11:37.551452 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:37.551425 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" Apr 23 18:11:37.552026 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:37.552000 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 18:11:47.552795 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:47.552737 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 18:11:57.552705 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:11:57.552662 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 18:12:07.552992 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:07.552947 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 18:12:10.344071 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:10.344044 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6g56n_ae56a92f-dfae-4763-b849-dca72bc2cf3d/kube-multus/0.log" Apr 23 18:12:10.346172 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:10.346152 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6g56n_ae56a92f-dfae-4763-b849-dca72bc2cf3d/kube-multus/0.log" Apr 23 18:12:17.552377 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:17.552335 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 18:12:27.552823 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:27.552781 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 18:12:37.552503 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:37.552476 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" Apr 23 18:12:44.927427 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.927392 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd"] Apr 23 18:12:44.927910 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.927655 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="storage-initializer" Apr 23 18:12:44.927910 
ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.927665 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="storage-initializer" Apr 23 18:12:44.927910 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.927674 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kube-rbac-proxy" Apr 23 18:12:44.927910 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.927681 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kube-rbac-proxy" Apr 23 18:12:44.927910 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.927701 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" Apr 23 18:12:44.927910 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.927708 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" Apr 23 18:12:44.927910 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.927770 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kserve-container" Apr 23 18:12:44.927910 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.927778 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="405f9c55-cdf2-4708-a0cf-3c8d34ebbf7c" containerName="kube-rbac-proxy" Apr 23 18:12:44.930751 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.930726 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 18:12:44.932791 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.932767 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-b0e9d2\"" Apr 23 18:12:44.932921 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.932842 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-b0e9d2-dockercfg-xbnnw\"" Apr 23 18:12:44.932921 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.932851 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-b0e9d2-predictor-serving-cert\"" Apr 23 18:12:44.933142 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.933125 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 18:12:44.933192 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.933172 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-b0e9d2-kube-rbac-proxy-sar-config\"" Apr 23 18:12:44.941319 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.941284 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd"] Apr 23 18:12:44.992333 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.992286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-b0e9d2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/669c3171-43f8-45d2-93aa-6fca7b7db984-isvc-secondary-b0e9d2-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") " pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 
18:12:44.992516 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.992383 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/669c3171-43f8-45d2-93aa-6fca7b7db984-cabundle-cert\") pod \"isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") " pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 18:12:44.992516 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.992423 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/669c3171-43f8-45d2-93aa-6fca7b7db984-proxy-tls\") pod \"isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") " pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 18:12:44.992516 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.992445 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/669c3171-43f8-45d2-93aa-6fca7b7db984-kserve-provision-location\") pod \"isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") " pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 18:12:44.992625 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:44.992531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p24z5\" (UniqueName: \"kubernetes.io/projected/669c3171-43f8-45d2-93aa-6fca7b7db984-kube-api-access-p24z5\") pod \"isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") " pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 18:12:45.093018 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:45.092977 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-b0e9d2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/669c3171-43f8-45d2-93aa-6fca7b7db984-isvc-secondary-b0e9d2-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") " pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 18:12:45.093206 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:45.093071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/669c3171-43f8-45d2-93aa-6fca7b7db984-cabundle-cert\") pod \"isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") " pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 18:12:45.093206 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:45.093109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/669c3171-43f8-45d2-93aa-6fca7b7db984-proxy-tls\") pod \"isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") " pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 18:12:45.093206 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:45.093136 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/669c3171-43f8-45d2-93aa-6fca7b7db984-kserve-provision-location\") pod \"isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") " pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 18:12:45.093206 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:45.093190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-p24z5\" (UniqueName: \"kubernetes.io/projected/669c3171-43f8-45d2-93aa-6fca7b7db984-kube-api-access-p24z5\") pod \"isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") " pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 18:12:45.093542 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:45.093513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/669c3171-43f8-45d2-93aa-6fca7b7db984-kserve-provision-location\") pod \"isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") " pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 18:12:45.093819 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:45.093800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/669c3171-43f8-45d2-93aa-6fca7b7db984-cabundle-cert\") pod \"isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") " pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 18:12:45.093889 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:45.093799 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-b0e9d2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/669c3171-43f8-45d2-93aa-6fca7b7db984-isvc-secondary-b0e9d2-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") " pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 18:12:45.095756 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:45.095718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/669c3171-43f8-45d2-93aa-6fca7b7db984-proxy-tls\") 
pod \"isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") " pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 18:12:45.101503 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:45.101464 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p24z5\" (UniqueName: \"kubernetes.io/projected/669c3171-43f8-45d2-93aa-6fca7b7db984-kube-api-access-p24z5\") pod \"isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") " pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 18:12:45.241420 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:45.241326 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" Apr 23 18:12:45.370094 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:45.370064 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd"] Apr 23 18:12:45.372573 ip-10-0-132-102 kubenswrapper[2576]: W0423 18:12:45.372530 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod669c3171_43f8_45d2_93aa_6fca7b7db984.slice/crio-5b04f69fc3907d684400e09a59e79b0b116c674faa6f18b729dc2494cbf0d9fa WatchSource:0}: Error finding container 5b04f69fc3907d684400e09a59e79b0b116c674faa6f18b729dc2494cbf0d9fa: Status 404 returned error can't find the container with id 5b04f69fc3907d684400e09a59e79b0b116c674faa6f18b729dc2494cbf0d9fa Apr 23 18:12:45.374393 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:45.374378 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:12:45.742604 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:45.742567 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" event={"ID":"669c3171-43f8-45d2-93aa-6fca7b7db984","Type":"ContainerStarted","Data":"07c2d34017fe4f1f8213010810b92ef439a53901c5d8e1c4b098b25a14ac13c7"} Apr 23 18:12:45.742604 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:45.742608 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" event={"ID":"669c3171-43f8-45d2-93aa-6fca7b7db984","Type":"ContainerStarted","Data":"5b04f69fc3907d684400e09a59e79b0b116c674faa6f18b729dc2494cbf0d9fa"} Apr 23 18:12:48.771048 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:48.771020 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd_669c3171-43f8-45d2-93aa-6fca7b7db984/storage-initializer/0.log" Apr 23 18:12:48.771530 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:48.771059 2576 generic.go:358] "Generic (PLEG): container finished" podID="669c3171-43f8-45d2-93aa-6fca7b7db984" containerID="07c2d34017fe4f1f8213010810b92ef439a53901c5d8e1c4b098b25a14ac13c7" exitCode=1 Apr 23 18:12:48.771530 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:48.771149 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" event={"ID":"669c3171-43f8-45d2-93aa-6fca7b7db984","Type":"ContainerDied","Data":"07c2d34017fe4f1f8213010810b92ef439a53901c5d8e1c4b098b25a14ac13c7"} Apr 23 18:12:49.775073 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:49.775043 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd_669c3171-43f8-45d2-93aa-6fca7b7db984/storage-initializer/0.log" Apr 23 18:12:49.775441 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:49.775126 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" 
event={"ID":"669c3171-43f8-45d2-93aa-6fca7b7db984","Type":"ContainerStarted","Data":"b675dcdf34b7eb472a1d993e7d209aa7125a257cad1453ccab030ab31d81dbe6"}
Apr 23 18:12:53.786352 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:53.786326 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd_669c3171-43f8-45d2-93aa-6fca7b7db984/storage-initializer/1.log"
Apr 23 18:12:53.786776 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:53.786637 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd_669c3171-43f8-45d2-93aa-6fca7b7db984/storage-initializer/0.log"
Apr 23 18:12:53.786776 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:53.786670 2576 generic.go:358] "Generic (PLEG): container finished" podID="669c3171-43f8-45d2-93aa-6fca7b7db984" containerID="b675dcdf34b7eb472a1d993e7d209aa7125a257cad1453ccab030ab31d81dbe6" exitCode=1
Apr 23 18:12:53.786776 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:53.786764 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" event={"ID":"669c3171-43f8-45d2-93aa-6fca7b7db984","Type":"ContainerDied","Data":"b675dcdf34b7eb472a1d993e7d209aa7125a257cad1453ccab030ab31d81dbe6"}
Apr 23 18:12:53.786894 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:53.786807 2576 scope.go:117] "RemoveContainer" containerID="07c2d34017fe4f1f8213010810b92ef439a53901c5d8e1c4b098b25a14ac13c7"
Apr 23 18:12:53.787245 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:53.787225 2576 scope.go:117] "RemoveContainer" containerID="07c2d34017fe4f1f8213010810b92ef439a53901c5d8e1c4b098b25a14ac13c7"
Apr 23 18:12:53.796999 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:12:53.796968 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd_kserve-ci-e2e-test_669c3171-43f8-45d2-93aa-6fca7b7db984_0 in pod sandbox 5b04f69fc3907d684400e09a59e79b0b116c674faa6f18b729dc2494cbf0d9fa from index: no such id: '07c2d34017fe4f1f8213010810b92ef439a53901c5d8e1c4b098b25a14ac13c7'" containerID="07c2d34017fe4f1f8213010810b92ef439a53901c5d8e1c4b098b25a14ac13c7"
Apr 23 18:12:53.797079 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:12:53.797026 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd_kserve-ci-e2e-test_669c3171-43f8-45d2-93aa-6fca7b7db984_0 in pod sandbox 5b04f69fc3907d684400e09a59e79b0b116c674faa6f18b729dc2494cbf0d9fa from index: no such id: '07c2d34017fe4f1f8213010810b92ef439a53901c5d8e1c4b098b25a14ac13c7'; Skipping pod \"isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd_kserve-ci-e2e-test(669c3171-43f8-45d2-93aa-6fca7b7db984)\"" logger="UnhandledError"
Apr 23 18:12:53.798355 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:12:53.798335 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd_kserve-ci-e2e-test(669c3171-43f8-45d2-93aa-6fca7b7db984)\"" pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" podUID="669c3171-43f8-45d2-93aa-6fca7b7db984"
Apr 23 18:12:54.790389 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:12:54.790322 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd_669c3171-43f8-45d2-93aa-6fca7b7db984/storage-initializer/1.log"
Apr 23 18:13:00.948507 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:00.948473 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"]
Apr 23 18:13:00.948990 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:00.948905 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kserve-container" containerID="cri-o://51f91d580d3fbd55d2103cf98198fadb1e2326aa7d65c02f1c04eb0ee317c01b" gracePeriod=30
Apr 23 18:13:00.948990 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:00.948953 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kube-rbac-proxy" containerID="cri-o://a102c6fe8d7ee5bff4b597fa46dfe3fa5bd491518bb4082a808d8fe78747c273" gracePeriod=30
Apr 23 18:13:01.011712 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.011676 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd"]
Apr 23 18:13:01.108484 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.108446 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"]
Apr 23 18:13:01.111793 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.111776 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.113960 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.113899 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-5052e1-predictor-serving-cert\""
Apr 23 18:13:01.114107 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.114028 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-5052e1-dockercfg-cstrb\""
Apr 23 18:13:01.114107 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.114073 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-5052e1\""
Apr 23 18:13:01.114285 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.114025 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-5052e1-kube-rbac-proxy-sar-config\""
Apr 23 18:13:01.121012 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.120984 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"]
Apr 23 18:13:01.140793 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.140771 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd_669c3171-43f8-45d2-93aa-6fca7b7db984/storage-initializer/1.log"
Apr 23 18:13:01.140920 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.140832 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd"
Apr 23 18:13:01.231512 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.231403 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/669c3171-43f8-45d2-93aa-6fca7b7db984-proxy-tls\") pod \"669c3171-43f8-45d2-93aa-6fca7b7db984\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") "
Apr 23 18:13:01.231512 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.231471 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-b0e9d2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/669c3171-43f8-45d2-93aa-6fca7b7db984-isvc-secondary-b0e9d2-kube-rbac-proxy-sar-config\") pod \"669c3171-43f8-45d2-93aa-6fca7b7db984\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") "
Apr 23 18:13:01.231775 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.231540 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/669c3171-43f8-45d2-93aa-6fca7b7db984-kserve-provision-location\") pod \"669c3171-43f8-45d2-93aa-6fca7b7db984\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") "
Apr 23 18:13:01.231775 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.231569 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p24z5\" (UniqueName: \"kubernetes.io/projected/669c3171-43f8-45d2-93aa-6fca7b7db984-kube-api-access-p24z5\") pod \"669c3171-43f8-45d2-93aa-6fca7b7db984\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") "
Apr 23 18:13:01.231775 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.231607 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/669c3171-43f8-45d2-93aa-6fca7b7db984-cabundle-cert\") pod \"669c3171-43f8-45d2-93aa-6fca7b7db984\" (UID: \"669c3171-43f8-45d2-93aa-6fca7b7db984\") "
Apr 23 18:13:01.231775 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.231727 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2c4m\" (UniqueName: \"kubernetes.io/projected/6e1c76fc-0856-4d86-8411-8e1687f408a4-kube-api-access-b2c4m\") pod \"isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.231997 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.231788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6e1c76fc-0856-4d86-8411-8e1687f408a4-cabundle-cert\") pod \"isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.231997 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.231834 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/669c3171-43f8-45d2-93aa-6fca7b7db984-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "669c3171-43f8-45d2-93aa-6fca7b7db984" (UID: "669c3171-43f8-45d2-93aa-6fca7b7db984"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:13:01.231997 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.231852 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e1c76fc-0856-4d86-8411-8e1687f408a4-proxy-tls\") pod \"isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.231997 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.231925 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-5052e1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e1c76fc-0856-4d86-8411-8e1687f408a4-isvc-init-fail-5052e1-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.231997 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.231967 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e1c76fc-0856-4d86-8411-8e1687f408a4-kserve-provision-location\") pod \"isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.231997 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.231970 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/669c3171-43f8-45d2-93aa-6fca7b7db984-isvc-secondary-b0e9d2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-b0e9d2-kube-rbac-proxy-sar-config") pod "669c3171-43f8-45d2-93aa-6fca7b7db984" (UID: "669c3171-43f8-45d2-93aa-6fca7b7db984"). InnerVolumeSpecName "isvc-secondary-b0e9d2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:13:01.232207 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.232030 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/669c3171-43f8-45d2-93aa-6fca7b7db984-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "669c3171-43f8-45d2-93aa-6fca7b7db984" (UID: "669c3171-43f8-45d2-93aa-6fca7b7db984"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:13:01.232207 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.232061 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-b0e9d2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/669c3171-43f8-45d2-93aa-6fca7b7db984-isvc-secondary-b0e9d2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:13:01.232207 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.232082 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/669c3171-43f8-45d2-93aa-6fca7b7db984-kserve-provision-location\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:13:01.233613 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.233593 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/669c3171-43f8-45d2-93aa-6fca7b7db984-kube-api-access-p24z5" (OuterVolumeSpecName: "kube-api-access-p24z5") pod "669c3171-43f8-45d2-93aa-6fca7b7db984" (UID: "669c3171-43f8-45d2-93aa-6fca7b7db984"). InnerVolumeSpecName "kube-api-access-p24z5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:13:01.233667 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.233614 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/669c3171-43f8-45d2-93aa-6fca7b7db984-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "669c3171-43f8-45d2-93aa-6fca7b7db984" (UID: "669c3171-43f8-45d2-93aa-6fca7b7db984"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:13:01.332968 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.332915 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-5052e1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e1c76fc-0856-4d86-8411-8e1687f408a4-isvc-init-fail-5052e1-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.332968 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.332976 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e1c76fc-0856-4d86-8411-8e1687f408a4-kserve-provision-location\") pod \"isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.333232 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.333030 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2c4m\" (UniqueName: \"kubernetes.io/projected/6e1c76fc-0856-4d86-8411-8e1687f408a4-kube-api-access-b2c4m\") pod \"isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.333232 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.333056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6e1c76fc-0856-4d86-8411-8e1687f408a4-cabundle-cert\") pod \"isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.333232 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.333098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e1c76fc-0856-4d86-8411-8e1687f408a4-proxy-tls\") pod \"isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.333232 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.333153 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p24z5\" (UniqueName: \"kubernetes.io/projected/669c3171-43f8-45d2-93aa-6fca7b7db984-kube-api-access-p24z5\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:13:01.333232 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.333168 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/669c3171-43f8-45d2-93aa-6fca7b7db984-cabundle-cert\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:13:01.333232 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.333183 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/669c3171-43f8-45d2-93aa-6fca7b7db984-proxy-tls\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:13:01.333476 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:13:01.333270 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-serving-cert: secret "isvc-init-fail-5052e1-predictor-serving-cert" not found
Apr 23 18:13:01.333476 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:13:01.333344 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e1c76fc-0856-4d86-8411-8e1687f408a4-proxy-tls podName:6e1c76fc-0856-4d86-8411-8e1687f408a4 nodeName:}" failed. No retries permitted until 2026-04-23 18:13:01.833323597 +0000 UTC m=+1252.068770292 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6e1c76fc-0856-4d86-8411-8e1687f408a4-proxy-tls") pod "isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk" (UID: "6e1c76fc-0856-4d86-8411-8e1687f408a4") : secret "isvc-init-fail-5052e1-predictor-serving-cert" not found
Apr 23 18:13:01.333476 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.333393 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e1c76fc-0856-4d86-8411-8e1687f408a4-kserve-provision-location\") pod \"isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.333578 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.333536 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-5052e1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e1c76fc-0856-4d86-8411-8e1687f408a4-isvc-init-fail-5052e1-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.333649 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.333628 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6e1c76fc-0856-4d86-8411-8e1687f408a4-cabundle-cert\") pod \"isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.343848 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.343829 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2c4m\" (UniqueName: \"kubernetes.io/projected/6e1c76fc-0856-4d86-8411-8e1687f408a4-kube-api-access-b2c4m\") pod \"isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.809696 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.809661 2576 generic.go:358] "Generic (PLEG): container finished" podID="b819f9d6-5d46-4210-96a1-25228eae230f" containerID="a102c6fe8d7ee5bff4b597fa46dfe3fa5bd491518bb4082a808d8fe78747c273" exitCode=2
Apr 23 18:13:01.809892 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.809728 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" event={"ID":"b819f9d6-5d46-4210-96a1-25228eae230f","Type":"ContainerDied","Data":"a102c6fe8d7ee5bff4b597fa46dfe3fa5bd491518bb4082a808d8fe78747c273"}
Apr 23 18:13:01.810795 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.810773 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd_669c3171-43f8-45d2-93aa-6fca7b7db984/storage-initializer/1.log"
Apr 23 18:13:01.810922 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.810819 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd" event={"ID":"669c3171-43f8-45d2-93aa-6fca7b7db984","Type":"ContainerDied","Data":"5b04f69fc3907d684400e09a59e79b0b116c674faa6f18b729dc2494cbf0d9fa"}
Apr 23 18:13:01.810922 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.810854 2576 scope.go:117] "RemoveContainer" containerID="b675dcdf34b7eb472a1d993e7d209aa7125a257cad1453ccab030ab31d81dbe6"
Apr 23 18:13:01.810922 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.810879 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd"
Apr 23 18:13:01.837580 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.837548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e1c76fc-0856-4d86-8411-8e1687f408a4-proxy-tls\") pod \"isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.839844 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.839826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e1c76fc-0856-4d86-8411-8e1687f408a4-proxy-tls\") pod \"isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:01.851331 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.851300 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd"]
Apr 23 18:13:01.856693 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:01.856667 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b0e9d2-predictor-6db877c85c-8v7gd"]
Apr 23 18:13:02.024327 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:02.024293 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"
Apr 23 18:13:02.146615 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:02.146548 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"]
Apr 23 18:13:02.149471 ip-10-0-132-102 kubenswrapper[2576]: W0423 18:13:02.149445 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e1c76fc_0856_4d86_8411_8e1687f408a4.slice/crio-e965ecc1f9f0a424452caf72fa718e7ace309b6f41c4ebfbd89dc164ce7fd1b0 WatchSource:0}: Error finding container e965ecc1f9f0a424452caf72fa718e7ace309b6f41c4ebfbd89dc164ce7fd1b0: Status 404 returned error can't find the container with id e965ecc1f9f0a424452caf72fa718e7ace309b6f41c4ebfbd89dc164ce7fd1b0
Apr 23 18:13:02.393650 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:02.393617 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="669c3171-43f8-45d2-93aa-6fca7b7db984" path="/var/lib/kubelet/pods/669c3171-43f8-45d2-93aa-6fca7b7db984/volumes"
Apr 23 18:13:02.547369 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:02.547330 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused"
Apr 23 18:13:02.815725 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:02.815634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk" event={"ID":"6e1c76fc-0856-4d86-8411-8e1687f408a4","Type":"ContainerStarted","Data":"fa6c55fe684cb28f1f10b98cb53417816871106c3622b785aadca18232370c57"}
Apr 23 18:13:02.815725 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:02.815678 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk" event={"ID":"6e1c76fc-0856-4d86-8411-8e1687f408a4","Type":"ContainerStarted","Data":"e965ecc1f9f0a424452caf72fa718e7ace309b6f41c4ebfbd89dc164ce7fd1b0"}
Apr 23 18:13:05.096976 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.096951 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:13:05.266623 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.266533 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjdft\" (UniqueName: \"kubernetes.io/projected/b819f9d6-5d46-4210-96a1-25228eae230f-kube-api-access-wjdft\") pod \"b819f9d6-5d46-4210-96a1-25228eae230f\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") "
Apr 23 18:13:05.266623 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.266580 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-b0e9d2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b819f9d6-5d46-4210-96a1-25228eae230f-isvc-primary-b0e9d2-kube-rbac-proxy-sar-config\") pod \"b819f9d6-5d46-4210-96a1-25228eae230f\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") "
Apr 23 18:13:05.266623 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.266622 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b819f9d6-5d46-4210-96a1-25228eae230f-kserve-provision-location\") pod \"b819f9d6-5d46-4210-96a1-25228eae230f\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") "
Apr 23 18:13:05.266945 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.266655 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b819f9d6-5d46-4210-96a1-25228eae230f-proxy-tls\") pod \"b819f9d6-5d46-4210-96a1-25228eae230f\" (UID: \"b819f9d6-5d46-4210-96a1-25228eae230f\") "
Apr 23 18:13:05.267010 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.266974 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b819f9d6-5d46-4210-96a1-25228eae230f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b819f9d6-5d46-4210-96a1-25228eae230f" (UID: "b819f9d6-5d46-4210-96a1-25228eae230f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:13:05.267067 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.267026 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b819f9d6-5d46-4210-96a1-25228eae230f-isvc-primary-b0e9d2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-b0e9d2-kube-rbac-proxy-sar-config") pod "b819f9d6-5d46-4210-96a1-25228eae230f" (UID: "b819f9d6-5d46-4210-96a1-25228eae230f"). InnerVolumeSpecName "isvc-primary-b0e9d2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:13:05.268727 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.268700 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b819f9d6-5d46-4210-96a1-25228eae230f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b819f9d6-5d46-4210-96a1-25228eae230f" (UID: "b819f9d6-5d46-4210-96a1-25228eae230f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:13:05.268863 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.268788 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b819f9d6-5d46-4210-96a1-25228eae230f-kube-api-access-wjdft" (OuterVolumeSpecName: "kube-api-access-wjdft") pod "b819f9d6-5d46-4210-96a1-25228eae230f" (UID: "b819f9d6-5d46-4210-96a1-25228eae230f"). InnerVolumeSpecName "kube-api-access-wjdft". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:13:05.367299 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.367261 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b819f9d6-5d46-4210-96a1-25228eae230f-proxy-tls\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:13:05.367299 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.367291 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wjdft\" (UniqueName: \"kubernetes.io/projected/b819f9d6-5d46-4210-96a1-25228eae230f-kube-api-access-wjdft\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:13:05.367299 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.367303 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-b0e9d2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b819f9d6-5d46-4210-96a1-25228eae230f-isvc-primary-b0e9d2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:13:05.367528 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.367313 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b819f9d6-5d46-4210-96a1-25228eae230f-kserve-provision-location\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:13:05.824904 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.824872 2576 generic.go:358] "Generic (PLEG): container finished" podID="b819f9d6-5d46-4210-96a1-25228eae230f" containerID="51f91d580d3fbd55d2103cf98198fadb1e2326aa7d65c02f1c04eb0ee317c01b" exitCode=0
Apr 23 18:13:05.825090 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.824926 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" event={"ID":"b819f9d6-5d46-4210-96a1-25228eae230f","Type":"ContainerDied","Data":"51f91d580d3fbd55d2103cf98198fadb1e2326aa7d65c02f1c04eb0ee317c01b"}
Apr 23 18:13:05.825090 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.824951 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"
Apr 23 18:13:05.825090 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.824961 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62" event={"ID":"b819f9d6-5d46-4210-96a1-25228eae230f","Type":"ContainerDied","Data":"805147926d2d43515cd835a423c65ba77cc97dbfddd5d5bbea8191b91322d998"}
Apr 23 18:13:05.825090 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.824978 2576 scope.go:117] "RemoveContainer" containerID="a102c6fe8d7ee5bff4b597fa46dfe3fa5bd491518bb4082a808d8fe78747c273"
Apr 23 18:13:05.832956 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.832593 2576 scope.go:117] "RemoveContainer" containerID="51f91d580d3fbd55d2103cf98198fadb1e2326aa7d65c02f1c04eb0ee317c01b"
Apr 23 18:13:05.839474 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.839458 2576 scope.go:117] "RemoveContainer" containerID="835199f614880428bb1ff8b457884ed5224f386eec7198f1641869f9132d19ec"
Apr 23 18:13:05.845969 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.845946 2576 scope.go:117] "RemoveContainer" containerID="a102c6fe8d7ee5bff4b597fa46dfe3fa5bd491518bb4082a808d8fe78747c273"
Apr 23 18:13:05.846222 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:13:05.846199 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a102c6fe8d7ee5bff4b597fa46dfe3fa5bd491518bb4082a808d8fe78747c273\": container with ID starting with a102c6fe8d7ee5bff4b597fa46dfe3fa5bd491518bb4082a808d8fe78747c273 not found: ID does not exist" containerID="a102c6fe8d7ee5bff4b597fa46dfe3fa5bd491518bb4082a808d8fe78747c273"
Apr 23 18:13:05.846283 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.846235 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a102c6fe8d7ee5bff4b597fa46dfe3fa5bd491518bb4082a808d8fe78747c273"} err="failed to get container status \"a102c6fe8d7ee5bff4b597fa46dfe3fa5bd491518bb4082a808d8fe78747c273\": rpc error: code = NotFound desc = could not find container \"a102c6fe8d7ee5bff4b597fa46dfe3fa5bd491518bb4082a808d8fe78747c273\": container with ID starting with a102c6fe8d7ee5bff4b597fa46dfe3fa5bd491518bb4082a808d8fe78747c273 not found: ID does not exist"
Apr 23 18:13:05.846283 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.846261 2576 scope.go:117] "RemoveContainer" containerID="51f91d580d3fbd55d2103cf98198fadb1e2326aa7d65c02f1c04eb0ee317c01b"
Apr 23 18:13:05.846507 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:13:05.846489 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51f91d580d3fbd55d2103cf98198fadb1e2326aa7d65c02f1c04eb0ee317c01b\": container with ID starting with 51f91d580d3fbd55d2103cf98198fadb1e2326aa7d65c02f1c04eb0ee317c01b not found: ID does not exist" containerID="51f91d580d3fbd55d2103cf98198fadb1e2326aa7d65c02f1c04eb0ee317c01b"
Apr 23 18:13:05.846548 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.846514 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f91d580d3fbd55d2103cf98198fadb1e2326aa7d65c02f1c04eb0ee317c01b"} err="failed to get container status \"51f91d580d3fbd55d2103cf98198fadb1e2326aa7d65c02f1c04eb0ee317c01b\": rpc error: code = NotFound desc = could not find container \"51f91d580d3fbd55d2103cf98198fadb1e2326aa7d65c02f1c04eb0ee317c01b\": container with ID starting with 51f91d580d3fbd55d2103cf98198fadb1e2326aa7d65c02f1c04eb0ee317c01b not found: ID does not exist"
Apr 23 18:13:05.846548 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.846531 2576 scope.go:117] "RemoveContainer" containerID="835199f614880428bb1ff8b457884ed5224f386eec7198f1641869f9132d19ec"
Apr 23 18:13:05.846791 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:13:05.846771 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"835199f614880428bb1ff8b457884ed5224f386eec7198f1641869f9132d19ec\": container with ID starting with 835199f614880428bb1ff8b457884ed5224f386eec7198f1641869f9132d19ec not found: ID does not exist" containerID="835199f614880428bb1ff8b457884ed5224f386eec7198f1641869f9132d19ec"
Apr 23 18:13:05.846872 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.846794 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835199f614880428bb1ff8b457884ed5224f386eec7198f1641869f9132d19ec"} err="failed to get container status \"835199f614880428bb1ff8b457884ed5224f386eec7198f1641869f9132d19ec\": rpc error: code = NotFound desc = could not find container \"835199f614880428bb1ff8b457884ed5224f386eec7198f1641869f9132d19ec\": container with ID starting with 835199f614880428bb1ff8b457884ed5224f386eec7198f1641869f9132d19ec not found: ID does not exist"
Apr 23 18:13:05.846872 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.846847 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"]
Apr 23 18:13:05.850693 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:05.850674 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b0e9d2-predictor-54c659f8-4mp62"]
Apr 23 18:13:06.393784 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:06.393727 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" path="/var/lib/kubelet/pods/b819f9d6-5d46-4210-96a1-25228eae230f/volumes"
Apr 23 18:13:07.832474 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:07.832398 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk_6e1c76fc-0856-4d86-8411-8e1687f408a4/storage-initializer/0.log"
Apr 23 18:13:07.832474 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:07.832436 2576 generic.go:358] "Generic (PLEG): container finished" podID="6e1c76fc-0856-4d86-8411-8e1687f408a4" containerID="fa6c55fe684cb28f1f10b98cb53417816871106c3622b785aadca18232370c57" exitCode=1
Apr 23 18:13:07.832868 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:07.832518 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk" event={"ID":"6e1c76fc-0856-4d86-8411-8e1687f408a4","Type":"ContainerDied","Data":"fa6c55fe684cb28f1f10b98cb53417816871106c3622b785aadca18232370c57"}
Apr 23 18:13:08.836801 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:08.836770 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk_6e1c76fc-0856-4d86-8411-8e1687f408a4/storage-initializer/0.log"
Apr 23 18:13:08.837184 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:08.836873 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk" event={"ID":"6e1c76fc-0856-4d86-8411-8e1687f408a4","Type":"ContainerStarted","Data":"d08042adb77e0bb29c375c9a99427881ec82e1bffe169b7b9cecd84b785b7233"}
Apr 23 18:13:11.127990 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.127959 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"]
Apr 23 18:13:11.128424 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.128343 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk" podUID="6e1c76fc-0856-4d86-8411-8e1687f408a4" containerName="storage-initializer" containerID="cri-o://d08042adb77e0bb29c375c9a99427881ec82e1bffe169b7b9cecd84b785b7233" gracePeriod=30
Apr 23 18:13:11.244329 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.244285 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5"]
Apr 23 18:13:11.244565 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.244553 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kserve-container"
Apr 23 18:13:11.244620 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.244567 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kserve-container"
Apr 23 18:13:11.244620 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.244581 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="storage-initializer"
Apr 23 18:13:11.244620 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.244591 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="storage-initializer"
Apr 23 18:13:11.244620 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.244603 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="669c3171-43f8-45d2-93aa-6fca7b7db984" containerName="storage-initializer"
Apr 23 18:13:11.244620 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.244609 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="669c3171-43f8-45d2-93aa-6fca7b7db984" containerName="storage-initializer"
Apr 23 18:13:11.244620 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.244616 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="669c3171-43f8-45d2-93aa-6fca7b7db984" containerName="storage-initializer" Apr
23 18:13:11.244620 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.244621 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="669c3171-43f8-45d2-93aa-6fca7b7db984" containerName="storage-initializer" Apr 23 18:13:11.244842 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.244629 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kube-rbac-proxy" Apr 23 18:13:11.244842 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.244634 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kube-rbac-proxy" Apr 23 18:13:11.244842 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.244690 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kserve-container" Apr 23 18:13:11.244842 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.244697 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="669c3171-43f8-45d2-93aa-6fca7b7db984" containerName="storage-initializer" Apr 23 18:13:11.244842 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.244706 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b819f9d6-5d46-4210-96a1-25228eae230f" containerName="kube-rbac-proxy" Apr 23 18:13:11.244842 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.244813 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="669c3171-43f8-45d2-93aa-6fca7b7db984" containerName="storage-initializer" Apr 23 18:13:11.249017 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.249000 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" Apr 23 18:13:11.250928 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.250906 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-0f5e3-predictor-serving-cert\"" Apr 23 18:13:11.251022 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.250977 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-0f5e3-kube-rbac-proxy-sar-config\"" Apr 23 18:13:11.251075 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.251037 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-9mp96\"" Apr 23 18:13:11.256515 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.256493 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5"] Apr 23 18:13:11.416366 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.416270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d4b9978-964d-419e-b6c2-4693a1625360-kserve-provision-location\") pod \"raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5\" (UID: \"2d4b9978-964d-419e-b6c2-4693a1625360\") " pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" Apr 23 18:13:11.416366 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.416316 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-0f5e3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d4b9978-964d-419e-b6c2-4693a1625360-raw-sklearn-0f5e3-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5\" (UID: \"2d4b9978-964d-419e-b6c2-4693a1625360\") " pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" 
Apr 23 18:13:11.416572 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.416394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62cn9\" (UniqueName: \"kubernetes.io/projected/2d4b9978-964d-419e-b6c2-4693a1625360-kube-api-access-62cn9\") pod \"raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5\" (UID: \"2d4b9978-964d-419e-b6c2-4693a1625360\") " pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" Apr 23 18:13:11.416572 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.416456 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d4b9978-964d-419e-b6c2-4693a1625360-proxy-tls\") pod \"raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5\" (UID: \"2d4b9978-964d-419e-b6c2-4693a1625360\") " pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" Apr 23 18:13:11.517727 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.517689 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d4b9978-964d-419e-b6c2-4693a1625360-proxy-tls\") pod \"raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5\" (UID: \"2d4b9978-964d-419e-b6c2-4693a1625360\") " pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" Apr 23 18:13:11.517902 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.517759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d4b9978-964d-419e-b6c2-4693a1625360-kserve-provision-location\") pod \"raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5\" (UID: \"2d4b9978-964d-419e-b6c2-4693a1625360\") " pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" Apr 23 18:13:11.517902 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.517784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"raw-sklearn-0f5e3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d4b9978-964d-419e-b6c2-4693a1625360-raw-sklearn-0f5e3-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5\" (UID: \"2d4b9978-964d-419e-b6c2-4693a1625360\") " pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" Apr 23 18:13:11.517902 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.517813 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62cn9\" (UniqueName: \"kubernetes.io/projected/2d4b9978-964d-419e-b6c2-4693a1625360-kube-api-access-62cn9\") pod \"raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5\" (UID: \"2d4b9978-964d-419e-b6c2-4693a1625360\") " pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" Apr 23 18:13:11.518196 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.518170 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d4b9978-964d-419e-b6c2-4693a1625360-kserve-provision-location\") pod \"raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5\" (UID: \"2d4b9978-964d-419e-b6c2-4693a1625360\") " pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" Apr 23 18:13:11.518497 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.518474 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-0f5e3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d4b9978-964d-419e-b6c2-4693a1625360-raw-sklearn-0f5e3-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5\" (UID: \"2d4b9978-964d-419e-b6c2-4693a1625360\") " pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" Apr 23 18:13:11.520167 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.520145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/2d4b9978-964d-419e-b6c2-4693a1625360-proxy-tls\") pod \"raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5\" (UID: \"2d4b9978-964d-419e-b6c2-4693a1625360\") " pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" Apr 23 18:13:11.525479 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.525457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62cn9\" (UniqueName: \"kubernetes.io/projected/2d4b9978-964d-419e-b6c2-4693a1625360-kube-api-access-62cn9\") pod \"raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5\" (UID: \"2d4b9978-964d-419e-b6c2-4693a1625360\") " pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" Apr 23 18:13:11.559929 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.559904 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" Apr 23 18:13:11.677843 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.677821 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5"] Apr 23 18:13:11.680059 ip-10-0-132-102 kubenswrapper[2576]: W0423 18:13:11.680028 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d4b9978_964d_419e_b6c2_4693a1625360.slice/crio-a4b4ec3c99ff136a1115a216aa7026a4beefeeb123cb3efb2a9c4361e717d3ba WatchSource:0}: Error finding container a4b4ec3c99ff136a1115a216aa7026a4beefeeb123cb3efb2a9c4361e717d3ba: Status 404 returned error can't find the container with id a4b4ec3c99ff136a1115a216aa7026a4beefeeb123cb3efb2a9c4361e717d3ba Apr 23 18:13:11.846401 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.846358 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" 
event={"ID":"2d4b9978-964d-419e-b6c2-4693a1625360","Type":"ContainerStarted","Data":"43f06ff2c2266cd8edd32b205b1ef08753c46dfaf54fe8b91f072701231d8a60"} Apr 23 18:13:11.846401 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:11.846403 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" event={"ID":"2d4b9978-964d-419e-b6c2-4693a1625360","Type":"ContainerStarted","Data":"a4b4ec3c99ff136a1115a216aa7026a4beefeeb123cb3efb2a9c4361e717d3ba"} Apr 23 18:13:13.855660 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:13.855634 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk_6e1c76fc-0856-4d86-8411-8e1687f408a4/storage-initializer/1.log" Apr 23 18:13:13.856063 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:13.856047 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk_6e1c76fc-0856-4d86-8411-8e1687f408a4/storage-initializer/0.log" Apr 23 18:13:13.856108 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:13.856080 2576 generic.go:358] "Generic (PLEG): container finished" podID="6e1c76fc-0856-4d86-8411-8e1687f408a4" containerID="d08042adb77e0bb29c375c9a99427881ec82e1bffe169b7b9cecd84b785b7233" exitCode=1 Apr 23 18:13:13.856157 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:13.856110 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk" event={"ID":"6e1c76fc-0856-4d86-8411-8e1687f408a4","Type":"ContainerDied","Data":"d08042adb77e0bb29c375c9a99427881ec82e1bffe169b7b9cecd84b785b7233"} Apr 23 18:13:13.856157 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:13.856140 2576 scope.go:117] "RemoveContainer" containerID="fa6c55fe684cb28f1f10b98cb53417816871106c3622b785aadca18232370c57" Apr 23 18:13:13.962695 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:13.962672 2576 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk_6e1c76fc-0856-4d86-8411-8e1687f408a4/storage-initializer/1.log" Apr 23 18:13:13.962824 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:13.962760 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk" Apr 23 18:13:14.135047 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.135012 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2c4m\" (UniqueName: \"kubernetes.io/projected/6e1c76fc-0856-4d86-8411-8e1687f408a4-kube-api-access-b2c4m\") pod \"6e1c76fc-0856-4d86-8411-8e1687f408a4\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " Apr 23 18:13:14.135236 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.135083 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e1c76fc-0856-4d86-8411-8e1687f408a4-proxy-tls\") pod \"6e1c76fc-0856-4d86-8411-8e1687f408a4\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " Apr 23 18:13:14.135236 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.135117 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6e1c76fc-0856-4d86-8411-8e1687f408a4-cabundle-cert\") pod \"6e1c76fc-0856-4d86-8411-8e1687f408a4\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " Apr 23 18:13:14.135236 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.135145 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-5052e1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e1c76fc-0856-4d86-8411-8e1687f408a4-isvc-init-fail-5052e1-kube-rbac-proxy-sar-config\") pod \"6e1c76fc-0856-4d86-8411-8e1687f408a4\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " Apr 23 18:13:14.135399 
ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.135283 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e1c76fc-0856-4d86-8411-8e1687f408a4-kserve-provision-location\") pod \"6e1c76fc-0856-4d86-8411-8e1687f408a4\" (UID: \"6e1c76fc-0856-4d86-8411-8e1687f408a4\") " Apr 23 18:13:14.135554 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.135531 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e1c76fc-0856-4d86-8411-8e1687f408a4-isvc-init-fail-5052e1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-5052e1-kube-rbac-proxy-sar-config") pod "6e1c76fc-0856-4d86-8411-8e1687f408a4" (UID: "6e1c76fc-0856-4d86-8411-8e1687f408a4"). InnerVolumeSpecName "isvc-init-fail-5052e1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:13:14.135647 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.135527 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e1c76fc-0856-4d86-8411-8e1687f408a4-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "6e1c76fc-0856-4d86-8411-8e1687f408a4" (UID: "6e1c76fc-0856-4d86-8411-8e1687f408a4"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:13:14.135647 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.135536 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1c76fc-0856-4d86-8411-8e1687f408a4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6e1c76fc-0856-4d86-8411-8e1687f408a4" (UID: "6e1c76fc-0856-4d86-8411-8e1687f408a4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:13:14.137247 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.137227 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1c76fc-0856-4d86-8411-8e1687f408a4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6e1c76fc-0856-4d86-8411-8e1687f408a4" (UID: "6e1c76fc-0856-4d86-8411-8e1687f408a4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:13:14.137316 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.137299 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1c76fc-0856-4d86-8411-8e1687f408a4-kube-api-access-b2c4m" (OuterVolumeSpecName: "kube-api-access-b2c4m") pod "6e1c76fc-0856-4d86-8411-8e1687f408a4" (UID: "6e1c76fc-0856-4d86-8411-8e1687f408a4"). InnerVolumeSpecName "kube-api-access-b2c4m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:13:14.236061 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.236003 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b2c4m\" (UniqueName: \"kubernetes.io/projected/6e1c76fc-0856-4d86-8411-8e1687f408a4-kube-api-access-b2c4m\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:13:14.236061 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.236055 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e1c76fc-0856-4d86-8411-8e1687f408a4-proxy-tls\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:13:14.236061 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.236069 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6e1c76fc-0856-4d86-8411-8e1687f408a4-cabundle-cert\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:13:14.236314 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.236083 2576 
reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-5052e1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e1c76fc-0856-4d86-8411-8e1687f408a4-isvc-init-fail-5052e1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:13:14.236314 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.236096 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e1c76fc-0856-4d86-8411-8e1687f408a4-kserve-provision-location\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\"" Apr 23 18:13:14.859674 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.859599 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk_6e1c76fc-0856-4d86-8411-8e1687f408a4/storage-initializer/1.log" Apr 23 18:13:14.860110 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.859681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk" event={"ID":"6e1c76fc-0856-4d86-8411-8e1687f408a4","Type":"ContainerDied","Data":"e965ecc1f9f0a424452caf72fa718e7ace309b6f41c4ebfbd89dc164ce7fd1b0"} Apr 23 18:13:14.860110 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.859708 2576 scope.go:117] "RemoveContainer" containerID="d08042adb77e0bb29c375c9a99427881ec82e1bffe169b7b9cecd84b785b7233" Apr 23 18:13:14.860110 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.859722 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk" Apr 23 18:13:14.889807 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.889778 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"] Apr 23 18:13:14.893050 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:14.893024 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-5052e1-predictor-68cfc88b5c-4jqgk"] Apr 23 18:13:15.864452 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:15.864419 2576 generic.go:358] "Generic (PLEG): container finished" podID="2d4b9978-964d-419e-b6c2-4693a1625360" containerID="43f06ff2c2266cd8edd32b205b1ef08753c46dfaf54fe8b91f072701231d8a60" exitCode=0 Apr 23 18:13:15.864956 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:15.864457 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" event={"ID":"2d4b9978-964d-419e-b6c2-4693a1625360","Type":"ContainerDied","Data":"43f06ff2c2266cd8edd32b205b1ef08753c46dfaf54fe8b91f072701231d8a60"} Apr 23 18:13:16.393508 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:16.393473 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e1c76fc-0856-4d86-8411-8e1687f408a4" path="/var/lib/kubelet/pods/6e1c76fc-0856-4d86-8411-8e1687f408a4/volumes" Apr 23 18:13:16.869536 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:16.869504 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" event={"ID":"2d4b9978-964d-419e-b6c2-4693a1625360","Type":"ContainerStarted","Data":"a21fbf62be60f2dbe7ce32187648dcff59a1d1927010d5aa8fe01d526a28922c"} Apr 23 18:13:16.870019 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:16.869546 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" 
event={"ID":"2d4b9978-964d-419e-b6c2-4693a1625360","Type":"ContainerStarted","Data":"a0d8b9404b6594753b01f7067e02faf37ddb4fed49523816c347e79fad224027"} Apr 23 18:13:16.870019 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:16.869766 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" Apr 23 18:13:16.887830 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:16.887794 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" podStartSLOduration=5.887781848 podStartE2EDuration="5.887781848s" podCreationTimestamp="2026-04-23 18:13:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:13:16.887009955 +0000 UTC m=+1267.122456659" watchObservedRunningTime="2026-04-23 18:13:16.887781848 +0000 UTC m=+1267.123228548" Apr 23 18:13:17.872968 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:17.872937 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" Apr 23 18:13:17.874216 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:17.874186 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:13:18.875240 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:18.875195 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:13:23.879907 
ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:23.879880 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" Apr 23 18:13:23.880469 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:23.880443 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:13:33.880659 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:33.880617 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:13:43.880972 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:43.880932 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:13:53.880649 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:13:53.880609 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:14:03.881338 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:03.881296 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 23 18:14:13.880842 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:13.880798 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 23 18:14:23.881825 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:23.881795 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5"
Apr 23 18:14:31.467281 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.467251 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5"]
Apr 23 18:14:31.467792 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.467567 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kserve-container" containerID="cri-o://a0d8b9404b6594753b01f7067e02faf37ddb4fed49523816c347e79fad224027" gracePeriod=30
Apr 23 18:14:31.467792 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.467592 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kube-rbac-proxy" containerID="cri-o://a21fbf62be60f2dbe7ce32187648dcff59a1d1927010d5aa8fe01d526a28922c" gracePeriod=30
Apr 23 18:14:31.492162 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.492131 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"]
Apr 23 18:14:31.492414 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.492403 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e1c76fc-0856-4d86-8411-8e1687f408a4" containerName="storage-initializer"
Apr 23 18:14:31.492460 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.492416 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1c76fc-0856-4d86-8411-8e1687f408a4" containerName="storage-initializer"
Apr 23 18:14:31.492495 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.492469 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e1c76fc-0856-4d86-8411-8e1687f408a4" containerName="storage-initializer"
Apr 23 18:14:31.492530 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.492510 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e1c76fc-0856-4d86-8411-8e1687f408a4" containerName="storage-initializer"
Apr 23 18:14:31.492530 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.492516 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1c76fc-0856-4d86-8411-8e1687f408a4" containerName="storage-initializer"
Apr 23 18:14:31.492592 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.492581 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e1c76fc-0856-4d86-8411-8e1687f408a4" containerName="storage-initializer"
Apr 23 18:14:31.495388 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.495373 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:31.498711 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.498693 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-5ccec-predictor-serving-cert\""
Apr 23 18:14:31.498813 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.498694 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-5ccec-kube-rbac-proxy-sar-config\""
Apr 23 18:14:31.510840 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.510820 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"]
Apr 23 18:14:31.630485 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.630448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-runtime-5ccec-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a3b6fb6-7d51-41b7-bd08-85493b58d372-raw-sklearn-runtime-5ccec-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm\" (UID: \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:31.630638 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.630509 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a3b6fb6-7d51-41b7-bd08-85493b58d372-proxy-tls\") pod \"raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm\" (UID: \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:31.630638 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.630549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t62dz\" (UniqueName: \"kubernetes.io/projected/9a3b6fb6-7d51-41b7-bd08-85493b58d372-kube-api-access-t62dz\") pod \"raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm\" (UID: \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:31.630638 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.630590 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a3b6fb6-7d51-41b7-bd08-85493b58d372-kserve-provision-location\") pod \"raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm\" (UID: \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:31.731227 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.731144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a3b6fb6-7d51-41b7-bd08-85493b58d372-proxy-tls\") pod \"raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm\" (UID: \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:31.731227 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.731195 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t62dz\" (UniqueName: \"kubernetes.io/projected/9a3b6fb6-7d51-41b7-bd08-85493b58d372-kube-api-access-t62dz\") pod \"raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm\" (UID: \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:31.731427 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.731236 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a3b6fb6-7d51-41b7-bd08-85493b58d372-kserve-provision-location\") pod \"raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm\" (UID: \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:31.731427 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.731281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-runtime-5ccec-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a3b6fb6-7d51-41b7-bd08-85493b58d372-raw-sklearn-runtime-5ccec-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm\" (UID: \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:31.731653 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.731633 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a3b6fb6-7d51-41b7-bd08-85493b58d372-kserve-provision-location\") pod \"raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm\" (UID: \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:31.731932 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.731911 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-runtime-5ccec-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a3b6fb6-7d51-41b7-bd08-85493b58d372-raw-sklearn-runtime-5ccec-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm\" (UID: \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:31.733495 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.733479 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a3b6fb6-7d51-41b7-bd08-85493b58d372-proxy-tls\") pod \"raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm\" (UID: \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:31.740629 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.740608 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t62dz\" (UniqueName: \"kubernetes.io/projected/9a3b6fb6-7d51-41b7-bd08-85493b58d372-kube-api-access-t62dz\") pod \"raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm\" (UID: \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:31.804165 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.804136 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:31.928000 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:31.927870 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"]
Apr 23 18:14:31.930623 ip-10-0-132-102 kubenswrapper[2576]: W0423 18:14:31.930589 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a3b6fb6_7d51_41b7_bd08_85493b58d372.slice/crio-2ecd99f961ada0494b9fa724877a2cc73e724a3b382d0095be0a57b5068d85b0 WatchSource:0}: Error finding container 2ecd99f961ada0494b9fa724877a2cc73e724a3b382d0095be0a57b5068d85b0: Status 404 returned error can't find the container with id 2ecd99f961ada0494b9fa724877a2cc73e724a3b382d0095be0a57b5068d85b0
Apr 23 18:14:32.072801 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:32.072762 2576 generic.go:358] "Generic (PLEG): container finished" podID="2d4b9978-964d-419e-b6c2-4693a1625360" containerID="a21fbf62be60f2dbe7ce32187648dcff59a1d1927010d5aa8fe01d526a28922c" exitCode=2
Apr 23 18:14:32.072978 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:32.072800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" event={"ID":"2d4b9978-964d-419e-b6c2-4693a1625360","Type":"ContainerDied","Data":"a21fbf62be60f2dbe7ce32187648dcff59a1d1927010d5aa8fe01d526a28922c"}
Apr 23 18:14:32.074112 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:32.074092 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" event={"ID":"9a3b6fb6-7d51-41b7-bd08-85493b58d372","Type":"ContainerStarted","Data":"91d51bf41d8123086c86044ee5e9e65c857670e2d5029a010b52f8cd6258e3c4"}
Apr 23 18:14:32.074216 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:32.074115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" event={"ID":"9a3b6fb6-7d51-41b7-bd08-85493b58d372","Type":"ContainerStarted","Data":"2ecd99f961ada0494b9fa724877a2cc73e724a3b382d0095be0a57b5068d85b0"}
Apr 23 18:14:33.876497 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:33.876452 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused"
Apr 23 18:14:33.880816 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:33.880794 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 23 18:14:36.002983 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.002959 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5"
Apr 23 18:14:36.086096 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.086061 2576 generic.go:358] "Generic (PLEG): container finished" podID="2d4b9978-964d-419e-b6c2-4693a1625360" containerID="a0d8b9404b6594753b01f7067e02faf37ddb4fed49523816c347e79fad224027" exitCode=0
Apr 23 18:14:36.086285 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.086146 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5"
Apr 23 18:14:36.086285 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.086152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" event={"ID":"2d4b9978-964d-419e-b6c2-4693a1625360","Type":"ContainerDied","Data":"a0d8b9404b6594753b01f7067e02faf37ddb4fed49523816c347e79fad224027"}
Apr 23 18:14:36.086285 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.086190 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5" event={"ID":"2d4b9978-964d-419e-b6c2-4693a1625360","Type":"ContainerDied","Data":"a4b4ec3c99ff136a1115a216aa7026a4beefeeb123cb3efb2a9c4361e717d3ba"}
Apr 23 18:14:36.086285 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.086212 2576 scope.go:117] "RemoveContainer" containerID="a21fbf62be60f2dbe7ce32187648dcff59a1d1927010d5aa8fe01d526a28922c"
Apr 23 18:14:36.087476 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.087455 2576 generic.go:358] "Generic (PLEG): container finished" podID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerID="91d51bf41d8123086c86044ee5e9e65c857670e2d5029a010b52f8cd6258e3c4" exitCode=0
Apr 23 18:14:36.087596 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.087490 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" event={"ID":"9a3b6fb6-7d51-41b7-bd08-85493b58d372","Type":"ContainerDied","Data":"91d51bf41d8123086c86044ee5e9e65c857670e2d5029a010b52f8cd6258e3c4"}
Apr 23 18:14:36.094410 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.094389 2576 scope.go:117] "RemoveContainer" containerID="a0d8b9404b6594753b01f7067e02faf37ddb4fed49523816c347e79fad224027"
Apr 23 18:14:36.101477 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.101457 2576 scope.go:117] "RemoveContainer" containerID="43f06ff2c2266cd8edd32b205b1ef08753c46dfaf54fe8b91f072701231d8a60"
Apr 23 18:14:36.111546 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.111525 2576 scope.go:117] "RemoveContainer" containerID="a21fbf62be60f2dbe7ce32187648dcff59a1d1927010d5aa8fe01d526a28922c"
Apr 23 18:14:36.111837 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:14:36.111811 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a21fbf62be60f2dbe7ce32187648dcff59a1d1927010d5aa8fe01d526a28922c\": container with ID starting with a21fbf62be60f2dbe7ce32187648dcff59a1d1927010d5aa8fe01d526a28922c not found: ID does not exist" containerID="a21fbf62be60f2dbe7ce32187648dcff59a1d1927010d5aa8fe01d526a28922c"
Apr 23 18:14:36.111938 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.111843 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a21fbf62be60f2dbe7ce32187648dcff59a1d1927010d5aa8fe01d526a28922c"} err="failed to get container status \"a21fbf62be60f2dbe7ce32187648dcff59a1d1927010d5aa8fe01d526a28922c\": rpc error: code = NotFound desc = could not find container \"a21fbf62be60f2dbe7ce32187648dcff59a1d1927010d5aa8fe01d526a28922c\": container with ID starting with a21fbf62be60f2dbe7ce32187648dcff59a1d1927010d5aa8fe01d526a28922c not found: ID does not exist"
Apr 23 18:14:36.111938 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.111863 2576 scope.go:117] "RemoveContainer" containerID="a0d8b9404b6594753b01f7067e02faf37ddb4fed49523816c347e79fad224027"
Apr 23 18:14:36.112090 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:14:36.112074 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0d8b9404b6594753b01f7067e02faf37ddb4fed49523816c347e79fad224027\": container with ID starting with a0d8b9404b6594753b01f7067e02faf37ddb4fed49523816c347e79fad224027 not found: ID does not exist" containerID="a0d8b9404b6594753b01f7067e02faf37ddb4fed49523816c347e79fad224027"
Apr 23 18:14:36.112139 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.112095 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0d8b9404b6594753b01f7067e02faf37ddb4fed49523816c347e79fad224027"} err="failed to get container status \"a0d8b9404b6594753b01f7067e02faf37ddb4fed49523816c347e79fad224027\": rpc error: code = NotFound desc = could not find container \"a0d8b9404b6594753b01f7067e02faf37ddb4fed49523816c347e79fad224027\": container with ID starting with a0d8b9404b6594753b01f7067e02faf37ddb4fed49523816c347e79fad224027 not found: ID does not exist"
Apr 23 18:14:36.112139 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.112109 2576 scope.go:117] "RemoveContainer" containerID="43f06ff2c2266cd8edd32b205b1ef08753c46dfaf54fe8b91f072701231d8a60"
Apr 23 18:14:36.112317 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:14:36.112298 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43f06ff2c2266cd8edd32b205b1ef08753c46dfaf54fe8b91f072701231d8a60\": container with ID starting with 43f06ff2c2266cd8edd32b205b1ef08753c46dfaf54fe8b91f072701231d8a60 not found: ID does not exist" containerID="43f06ff2c2266cd8edd32b205b1ef08753c46dfaf54fe8b91f072701231d8a60"
Apr 23 18:14:36.112377 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.112325 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43f06ff2c2266cd8edd32b205b1ef08753c46dfaf54fe8b91f072701231d8a60"} err="failed to get container status \"43f06ff2c2266cd8edd32b205b1ef08753c46dfaf54fe8b91f072701231d8a60\": rpc error: code = NotFound desc = could not find container \"43f06ff2c2266cd8edd32b205b1ef08753c46dfaf54fe8b91f072701231d8a60\": container with ID starting with 43f06ff2c2266cd8edd32b205b1ef08753c46dfaf54fe8b91f072701231d8a60 not found: ID does not exist"
Apr 23 18:14:36.168048 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.168026 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d4b9978-964d-419e-b6c2-4693a1625360-kserve-provision-location\") pod \"2d4b9978-964d-419e-b6c2-4693a1625360\" (UID: \"2d4b9978-964d-419e-b6c2-4693a1625360\") "
Apr 23 18:14:36.168162 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.168099 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d4b9978-964d-419e-b6c2-4693a1625360-proxy-tls\") pod \"2d4b9978-964d-419e-b6c2-4693a1625360\" (UID: \"2d4b9978-964d-419e-b6c2-4693a1625360\") "
Apr 23 18:14:36.168162 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.168147 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62cn9\" (UniqueName: \"kubernetes.io/projected/2d4b9978-964d-419e-b6c2-4693a1625360-kube-api-access-62cn9\") pod \"2d4b9978-964d-419e-b6c2-4693a1625360\" (UID: \"2d4b9978-964d-419e-b6c2-4693a1625360\") "
Apr 23 18:14:36.168371 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.168350 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-0f5e3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d4b9978-964d-419e-b6c2-4693a1625360-raw-sklearn-0f5e3-kube-rbac-proxy-sar-config\") pod \"2d4b9978-964d-419e-b6c2-4693a1625360\" (UID: \"2d4b9978-964d-419e-b6c2-4693a1625360\") "
Apr 23 18:14:36.168457 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.168361 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d4b9978-964d-419e-b6c2-4693a1625360-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2d4b9978-964d-419e-b6c2-4693a1625360" (UID: "2d4b9978-964d-419e-b6c2-4693a1625360"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:14:36.168611 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.168590 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d4b9978-964d-419e-b6c2-4693a1625360-kserve-provision-location\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:14:36.168714 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.168689 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d4b9978-964d-419e-b6c2-4693a1625360-raw-sklearn-0f5e3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-0f5e3-kube-rbac-proxy-sar-config") pod "2d4b9978-964d-419e-b6c2-4693a1625360" (UID: "2d4b9978-964d-419e-b6c2-4693a1625360"). InnerVolumeSpecName "raw-sklearn-0f5e3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:14:36.170357 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.170333 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d4b9978-964d-419e-b6c2-4693a1625360-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2d4b9978-964d-419e-b6c2-4693a1625360" (UID: "2d4b9978-964d-419e-b6c2-4693a1625360"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:14:36.170452 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.170339 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d4b9978-964d-419e-b6c2-4693a1625360-kube-api-access-62cn9" (OuterVolumeSpecName: "kube-api-access-62cn9") pod "2d4b9978-964d-419e-b6c2-4693a1625360" (UID: "2d4b9978-964d-419e-b6c2-4693a1625360"). InnerVolumeSpecName "kube-api-access-62cn9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:14:36.269417 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.269377 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d4b9978-964d-419e-b6c2-4693a1625360-proxy-tls\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:14:36.269417 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.269410 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-62cn9\" (UniqueName: \"kubernetes.io/projected/2d4b9978-964d-419e-b6c2-4693a1625360-kube-api-access-62cn9\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:14:36.269417 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.269420 2576 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-0f5e3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d4b9978-964d-419e-b6c2-4693a1625360-raw-sklearn-0f5e3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:14:36.409136 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.409104 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5"]
Apr 23 18:14:36.413713 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:36.413685 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-0f5e3-predictor-675fcf7fdc-zq4l5"]
Apr 23 18:14:37.092632 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:37.092542 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" event={"ID":"9a3b6fb6-7d51-41b7-bd08-85493b58d372","Type":"ContainerStarted","Data":"5c47b66f54db744f0e06fb25e1dc9a49ad6c23fcedd81d23a556bc9ff0757aa9"}
Apr 23 18:14:37.092632 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:37.092580 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" event={"ID":"9a3b6fb6-7d51-41b7-bd08-85493b58d372","Type":"ContainerStarted","Data":"8eaced1f6c25c500395b8e269ecf607602645c3d760d75f6a77734490c86d446"}
Apr 23 18:14:37.093158 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:37.092871 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:37.113639 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:37.113594 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" podStartSLOduration=6.113579996 podStartE2EDuration="6.113579996s" podCreationTimestamp="2026-04-23 18:14:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:14:37.112623601 +0000 UTC m=+1347.348070341" watchObservedRunningTime="2026-04-23 18:14:37.113579996 +0000 UTC m=+1347.349026698"
Apr 23 18:14:38.095418 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:38.095386 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:38.096525 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:38.096495 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 18:14:38.394127 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:38.394094 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" path="/var/lib/kubelet/pods/2d4b9978-964d-419e-b6c2-4693a1625360/volumes"
Apr 23 18:14:39.098218 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:39.098182 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 18:14:44.102272 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:44.102240 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:14:44.102845 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:44.102820 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 18:14:54.102833 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:14:54.102793 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 18:15:04.103068 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:04.103026 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 18:15:14.103783 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:14.103696 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 18:15:24.102780 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:24.102717 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 18:15:34.103120 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:34.103080 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 18:15:44.104191 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:44.104158 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:15:51.627905 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:51.627866 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"]
Apr 23 18:15:51.628443 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:51.628410 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kserve-container" containerID="cri-o://8eaced1f6c25c500395b8e269ecf607602645c3d760d75f6a77734490c86d446" gracePeriod=30
Apr 23 18:15:51.628718 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:51.628687 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kube-rbac-proxy" containerID="cri-o://5c47b66f54db744f0e06fb25e1dc9a49ad6c23fcedd81d23a556bc9ff0757aa9" gracePeriod=30
Apr 23 18:15:52.300557 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:52.300521 2576 generic.go:358] "Generic (PLEG): container finished" podID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerID="5c47b66f54db744f0e06fb25e1dc9a49ad6c23fcedd81d23a556bc9ff0757aa9" exitCode=2
Apr 23 18:15:52.300756 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:52.300604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" event={"ID":"9a3b6fb6-7d51-41b7-bd08-85493b58d372","Type":"ContainerDied","Data":"5c47b66f54db744f0e06fb25e1dc9a49ad6c23fcedd81d23a556bc9ff0757aa9"}
Apr 23 18:15:54.099153 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:54.099107 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused"
Apr 23 18:15:54.103609 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:54.103574 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 18:15:55.870542 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:55.870519 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:15:55.994422 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:55.994319 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t62dz\" (UniqueName: \"kubernetes.io/projected/9a3b6fb6-7d51-41b7-bd08-85493b58d372-kube-api-access-t62dz\") pod \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\" (UID: \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\") "
Apr 23 18:15:55.994422 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:55.994392 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-runtime-5ccec-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a3b6fb6-7d51-41b7-bd08-85493b58d372-raw-sklearn-runtime-5ccec-kube-rbac-proxy-sar-config\") pod \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\" (UID: \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\") "
Apr 23 18:15:55.994664 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:55.994434 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a3b6fb6-7d51-41b7-bd08-85493b58d372-proxy-tls\") pod \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\" (UID: \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\") "
Apr 23 18:15:55.994664 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:55.994474 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a3b6fb6-7d51-41b7-bd08-85493b58d372-kserve-provision-location\") pod \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\" (UID: \"9a3b6fb6-7d51-41b7-bd08-85493b58d372\") "
Apr 23 18:15:55.994831 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:55.994802 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a3b6fb6-7d51-41b7-bd08-85493b58d372-raw-sklearn-runtime-5ccec-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-runtime-5ccec-kube-rbac-proxy-sar-config") pod "9a3b6fb6-7d51-41b7-bd08-85493b58d372" (UID: "9a3b6fb6-7d51-41b7-bd08-85493b58d372"). InnerVolumeSpecName "raw-sklearn-runtime-5ccec-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:15:55.994887 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:55.994803 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a3b6fb6-7d51-41b7-bd08-85493b58d372-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9a3b6fb6-7d51-41b7-bd08-85493b58d372" (UID: "9a3b6fb6-7d51-41b7-bd08-85493b58d372"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:15:55.996545 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:55.996510 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a3b6fb6-7d51-41b7-bd08-85493b58d372-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9a3b6fb6-7d51-41b7-bd08-85493b58d372" (UID: "9a3b6fb6-7d51-41b7-bd08-85493b58d372"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:15:55.996545 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:55.996520 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a3b6fb6-7d51-41b7-bd08-85493b58d372-kube-api-access-t62dz" (OuterVolumeSpecName: "kube-api-access-t62dz") pod "9a3b6fb6-7d51-41b7-bd08-85493b58d372" (UID: "9a3b6fb6-7d51-41b7-bd08-85493b58d372"). InnerVolumeSpecName "kube-api-access-t62dz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:15:56.095090 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.095044 2576 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-runtime-5ccec-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a3b6fb6-7d51-41b7-bd08-85493b58d372-raw-sklearn-runtime-5ccec-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:15:56.095090 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.095086 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a3b6fb6-7d51-41b7-bd08-85493b58d372-proxy-tls\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:15:56.095090 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.095099 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a3b6fb6-7d51-41b7-bd08-85493b58d372-kserve-provision-location\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:15:56.095090 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.095109 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t62dz\" (UniqueName: \"kubernetes.io/projected/9a3b6fb6-7d51-41b7-bd08-85493b58d372-kube-api-access-t62dz\") on node \"ip-10-0-132-102.ec2.internal\" DevicePath \"\""
Apr 23 18:15:56.312528 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.312434 2576 generic.go:358] "Generic (PLEG): container finished" podID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerID="8eaced1f6c25c500395b8e269ecf607602645c3d760d75f6a77734490c86d446" exitCode=0
Apr 23 18:15:56.312528 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.312513 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"
Apr 23 18:15:56.312721 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.312524 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" event={"ID":"9a3b6fb6-7d51-41b7-bd08-85493b58d372","Type":"ContainerDied","Data":"8eaced1f6c25c500395b8e269ecf607602645c3d760d75f6a77734490c86d446"}
Apr 23 18:15:56.312721 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.312564 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm" event={"ID":"9a3b6fb6-7d51-41b7-bd08-85493b58d372","Type":"ContainerDied","Data":"2ecd99f961ada0494b9fa724877a2cc73e724a3b382d0095be0a57b5068d85b0"}
Apr 23 18:15:56.312721 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.312580 2576 scope.go:117] "RemoveContainer" containerID="5c47b66f54db744f0e06fb25e1dc9a49ad6c23fcedd81d23a556bc9ff0757aa9"
Apr 23 18:15:56.320508 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.320483 2576 scope.go:117] "RemoveContainer" containerID="8eaced1f6c25c500395b8e269ecf607602645c3d760d75f6a77734490c86d446"
Apr 23 18:15:56.327379 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.327362 2576 scope.go:117] "RemoveContainer" containerID="91d51bf41d8123086c86044ee5e9e65c857670e2d5029a010b52f8cd6258e3c4"
Apr 23 18:15:56.332574 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.332552 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"]
Apr 23 18:15:56.334713 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.334692 2576 scope.go:117] "RemoveContainer" containerID="5c47b66f54db744f0e06fb25e1dc9a49ad6c23fcedd81d23a556bc9ff0757aa9"
Apr 23 18:15:56.335093 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:15:56.335050 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error:
code = NotFound desc = could not find container \"5c47b66f54db744f0e06fb25e1dc9a49ad6c23fcedd81d23a556bc9ff0757aa9\": container with ID starting with 5c47b66f54db744f0e06fb25e1dc9a49ad6c23fcedd81d23a556bc9ff0757aa9 not found: ID does not exist" containerID="5c47b66f54db744f0e06fb25e1dc9a49ad6c23fcedd81d23a556bc9ff0757aa9" Apr 23 18:15:56.335093 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.335080 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c47b66f54db744f0e06fb25e1dc9a49ad6c23fcedd81d23a556bc9ff0757aa9"} err="failed to get container status \"5c47b66f54db744f0e06fb25e1dc9a49ad6c23fcedd81d23a556bc9ff0757aa9\": rpc error: code = NotFound desc = could not find container \"5c47b66f54db744f0e06fb25e1dc9a49ad6c23fcedd81d23a556bc9ff0757aa9\": container with ID starting with 5c47b66f54db744f0e06fb25e1dc9a49ad6c23fcedd81d23a556bc9ff0757aa9 not found: ID does not exist" Apr 23 18:15:56.335277 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.335098 2576 scope.go:117] "RemoveContainer" containerID="8eaced1f6c25c500395b8e269ecf607602645c3d760d75f6a77734490c86d446" Apr 23 18:15:56.335650 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:15:56.335627 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eaced1f6c25c500395b8e269ecf607602645c3d760d75f6a77734490c86d446\": container with ID starting with 8eaced1f6c25c500395b8e269ecf607602645c3d760d75f6a77734490c86d446 not found: ID does not exist" containerID="8eaced1f6c25c500395b8e269ecf607602645c3d760d75f6a77734490c86d446" Apr 23 18:15:56.335732 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.335659 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eaced1f6c25c500395b8e269ecf607602645c3d760d75f6a77734490c86d446"} err="failed to get container status \"8eaced1f6c25c500395b8e269ecf607602645c3d760d75f6a77734490c86d446\": rpc error: code = 
NotFound desc = could not find container \"8eaced1f6c25c500395b8e269ecf607602645c3d760d75f6a77734490c86d446\": container with ID starting with 8eaced1f6c25c500395b8e269ecf607602645c3d760d75f6a77734490c86d446 not found: ID does not exist" Apr 23 18:15:56.335732 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.335682 2576 scope.go:117] "RemoveContainer" containerID="91d51bf41d8123086c86044ee5e9e65c857670e2d5029a010b52f8cd6258e3c4" Apr 23 18:15:56.335983 ip-10-0-132-102 kubenswrapper[2576]: E0423 18:15:56.335960 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d51bf41d8123086c86044ee5e9e65c857670e2d5029a010b52f8cd6258e3c4\": container with ID starting with 91d51bf41d8123086c86044ee5e9e65c857670e2d5029a010b52f8cd6258e3c4 not found: ID does not exist" containerID="91d51bf41d8123086c86044ee5e9e65c857670e2d5029a010b52f8cd6258e3c4" Apr 23 18:15:56.336035 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.335992 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d51bf41d8123086c86044ee5e9e65c857670e2d5029a010b52f8cd6258e3c4"} err="failed to get container status \"91d51bf41d8123086c86044ee5e9e65c857670e2d5029a010b52f8cd6258e3c4\": rpc error: code = NotFound desc = could not find container \"91d51bf41d8123086c86044ee5e9e65c857670e2d5029a010b52f8cd6258e3c4\": container with ID starting with 91d51bf41d8123086c86044ee5e9e65c857670e2d5029a010b52f8cd6258e3c4 not found: ID does not exist" Apr 23 18:15:56.336646 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.336627 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-5ccec-predictor-6dbd6f74c7-mpgsm"] Apr 23 18:15:56.393135 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:15:56.393099 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" 
path="/var/lib/kubelet/pods/9a3b6fb6-7d51-41b7-bd08-85493b58d372/volumes" Apr 23 18:16:16.767196 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767161 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-46lhs/must-gather-5q6gr"] Apr 23 18:16:16.767560 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767399 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kserve-container" Apr 23 18:16:16.767560 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767409 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kserve-container" Apr 23 18:16:16.767560 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767424 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kube-rbac-proxy" Apr 23 18:16:16.767560 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767431 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kube-rbac-proxy" Apr 23 18:16:16.767560 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767437 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kube-rbac-proxy" Apr 23 18:16:16.767560 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767443 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kube-rbac-proxy" Apr 23 18:16:16.767560 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767449 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kserve-container" Apr 23 18:16:16.767560 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767454 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kserve-container" Apr 23 18:16:16.767560 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767462 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="storage-initializer" Apr 23 18:16:16.767560 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767467 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="storage-initializer" Apr 23 18:16:16.767560 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767478 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="storage-initializer" Apr 23 18:16:16.767560 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767483 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="storage-initializer" Apr 23 18:16:16.767560 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767519 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kube-rbac-proxy" Apr 23 18:16:16.767560 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767529 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kserve-container" Apr 23 18:16:16.767560 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767536 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a3b6fb6-7d51-41b7-bd08-85493b58d372" containerName="kube-rbac-proxy" Apr 23 18:16:16.767560 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.767542 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d4b9978-964d-419e-b6c2-4693a1625360" containerName="kserve-container" Apr 23 18:16:16.771053 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.771037 2576 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-must-gather-46lhs/must-gather-5q6gr" Apr 23 18:16:16.772952 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.772925 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-46lhs\"/\"openshift-service-ca.crt\"" Apr 23 18:16:16.773100 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.773067 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-46lhs\"/\"kube-root-ca.crt\"" Apr 23 18:16:16.773425 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.773407 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-46lhs\"/\"default-dockercfg-g7l5f\"" Apr 23 18:16:16.776721 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.776697 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-46lhs/must-gather-5q6gr"] Apr 23 18:16:16.850218 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.850174 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxlbw\" (UniqueName: \"kubernetes.io/projected/10ec2e3d-919a-4053-8c18-7e8c9e515815-kube-api-access-qxlbw\") pod \"must-gather-5q6gr\" (UID: \"10ec2e3d-919a-4053-8c18-7e8c9e515815\") " pod="openshift-must-gather-46lhs/must-gather-5q6gr" Apr 23 18:16:16.850218 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.850224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/10ec2e3d-919a-4053-8c18-7e8c9e515815-must-gather-output\") pod \"must-gather-5q6gr\" (UID: \"10ec2e3d-919a-4053-8c18-7e8c9e515815\") " pod="openshift-must-gather-46lhs/must-gather-5q6gr" Apr 23 18:16:16.951010 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.950971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qxlbw\" (UniqueName: \"kubernetes.io/projected/10ec2e3d-919a-4053-8c18-7e8c9e515815-kube-api-access-qxlbw\") pod \"must-gather-5q6gr\" (UID: \"10ec2e3d-919a-4053-8c18-7e8c9e515815\") " pod="openshift-must-gather-46lhs/must-gather-5q6gr" Apr 23 18:16:16.951010 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.951012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/10ec2e3d-919a-4053-8c18-7e8c9e515815-must-gather-output\") pod \"must-gather-5q6gr\" (UID: \"10ec2e3d-919a-4053-8c18-7e8c9e515815\") " pod="openshift-must-gather-46lhs/must-gather-5q6gr" Apr 23 18:16:16.951290 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.951275 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/10ec2e3d-919a-4053-8c18-7e8c9e515815-must-gather-output\") pod \"must-gather-5q6gr\" (UID: \"10ec2e3d-919a-4053-8c18-7e8c9e515815\") " pod="openshift-must-gather-46lhs/must-gather-5q6gr" Apr 23 18:16:16.958339 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:16.958307 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxlbw\" (UniqueName: \"kubernetes.io/projected/10ec2e3d-919a-4053-8c18-7e8c9e515815-kube-api-access-qxlbw\") pod \"must-gather-5q6gr\" (UID: \"10ec2e3d-919a-4053-8c18-7e8c9e515815\") " pod="openshift-must-gather-46lhs/must-gather-5q6gr" Apr 23 18:16:17.080505 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:17.080405 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-46lhs/must-gather-5q6gr" Apr 23 18:16:17.200382 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:17.200347 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-46lhs/must-gather-5q6gr"] Apr 23 18:16:17.203372 ip-10-0-132-102 kubenswrapper[2576]: W0423 18:16:17.203339 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10ec2e3d_919a_4053_8c18_7e8c9e515815.slice/crio-25e96b095b4e0d314b9972b886084bf52ef8a58d0ee59b00da42cef5b430d060 WatchSource:0}: Error finding container 25e96b095b4e0d314b9972b886084bf52ef8a58d0ee59b00da42cef5b430d060: Status 404 returned error can't find the container with id 25e96b095b4e0d314b9972b886084bf52ef8a58d0ee59b00da42cef5b430d060 Apr 23 18:16:17.367706 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:17.367674 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-46lhs/must-gather-5q6gr" event={"ID":"10ec2e3d-919a-4053-8c18-7e8c9e515815","Type":"ContainerStarted","Data":"25e96b095b4e0d314b9972b886084bf52ef8a58d0ee59b00da42cef5b430d060"} Apr 23 18:16:18.371593 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:18.371565 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-46lhs/must-gather-5q6gr" event={"ID":"10ec2e3d-919a-4053-8c18-7e8c9e515815","Type":"ContainerStarted","Data":"8b2929d36bf7a2891940a4b9ff5a736fff724e2d41c5f6b47c30f88895fe1709"} Apr 23 18:16:19.375687 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:19.375645 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-46lhs/must-gather-5q6gr" event={"ID":"10ec2e3d-919a-4053-8c18-7e8c9e515815","Type":"ContainerStarted","Data":"054ddc31834159194fe5912a02860d441075a9ca9a3fcdf8bb4f49de06b9910c"} Apr 23 18:16:19.395761 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:19.393488 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-46lhs/must-gather-5q6gr" podStartSLOduration=2.344341114 podStartE2EDuration="3.39346992s" podCreationTimestamp="2026-04-23 18:16:16 +0000 UTC" firstStartedPulling="2026-04-23 18:16:17.20523547 +0000 UTC m=+1447.440682151" lastFinishedPulling="2026-04-23 18:16:18.25436426 +0000 UTC m=+1448.489810957" observedRunningTime="2026-04-23 18:16:19.39028725 +0000 UTC m=+1449.625733954" watchObservedRunningTime="2026-04-23 18:16:19.39346992 +0000 UTC m=+1449.628916623" Apr 23 18:16:19.811649 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:19.811568 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-q7mhh_c10ccf97-5e76-4972-b775-25d5b2e5a887/global-pull-secret-syncer/0.log" Apr 23 18:16:19.895729 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:19.895696 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-bkrt6_712ef82b-3fe4-488d-9956-2e0264016fa7/konnectivity-agent/0.log" Apr 23 18:16:19.971081 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:19.971043 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-102.ec2.internal_13a2aab92beaa8cd38c68b02321633e1/haproxy/0.log" Apr 23 18:16:23.490895 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:23.490865 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-594fb98f6c-rmldp_a750ac45-1a8d-4704-9fcb-4701164f2bd7/metrics-server/0.log" Apr 23 18:16:23.518094 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:23.518063 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-ppx75_df7cb1b7-966f-446d-8632-851efad07ab1/monitoring-plugin/0.log" Apr 23 18:16:23.774091 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:23.774016 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-cpw42_ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9/node-exporter/0.log" Apr 23 18:16:23.796482 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:23.796453 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cpw42_ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9/kube-rbac-proxy/0.log" Apr 23 18:16:23.826227 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:23.826202 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cpw42_ecf8cb7c-9030-427c-bcf5-3a814bf8d6d9/init-textfile/0.log" Apr 23 18:16:26.284702 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.284674 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56869659c-xlgm4_a91a7fa7-a54b-4022-b381-1f1f05e156b0/console/0.log" Apr 23 18:16:26.314602 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.314579 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-8dtjc_2028d82d-64c8-4897-a6c2-1adb482b3e8d/download-server/0.log" Apr 23 18:16:26.608689 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.608601 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm"] Apr 23 18:16:26.614265 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.614223 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:26.619407 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.619378 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm"] Apr 23 18:16:26.733623 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.733588 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/373d78da-238e-4288-afff-4c0086eb7ade-sys\") pod \"perf-node-gather-daemonset-t5ldm\" (UID: \"373d78da-238e-4288-afff-4c0086eb7ade\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:26.733811 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.733631 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q6tv\" (UniqueName: \"kubernetes.io/projected/373d78da-238e-4288-afff-4c0086eb7ade-kube-api-access-9q6tv\") pod \"perf-node-gather-daemonset-t5ldm\" (UID: \"373d78da-238e-4288-afff-4c0086eb7ade\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:26.733811 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.733698 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/373d78da-238e-4288-afff-4c0086eb7ade-proc\") pod \"perf-node-gather-daemonset-t5ldm\" (UID: \"373d78da-238e-4288-afff-4c0086eb7ade\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:26.733953 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.733803 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/373d78da-238e-4288-afff-4c0086eb7ade-lib-modules\") pod \"perf-node-gather-daemonset-t5ldm\" (UID: 
\"373d78da-238e-4288-afff-4c0086eb7ade\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:26.733953 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.733841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/373d78da-238e-4288-afff-4c0086eb7ade-podres\") pod \"perf-node-gather-daemonset-t5ldm\" (UID: \"373d78da-238e-4288-afff-4c0086eb7ade\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:26.834401 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.834361 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/373d78da-238e-4288-afff-4c0086eb7ade-proc\") pod \"perf-node-gather-daemonset-t5ldm\" (UID: \"373d78da-238e-4288-afff-4c0086eb7ade\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:26.834401 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.834409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/373d78da-238e-4288-afff-4c0086eb7ade-lib-modules\") pod \"perf-node-gather-daemonset-t5ldm\" (UID: \"373d78da-238e-4288-afff-4c0086eb7ade\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:26.834656 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.834433 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/373d78da-238e-4288-afff-4c0086eb7ade-podres\") pod \"perf-node-gather-daemonset-t5ldm\" (UID: \"373d78da-238e-4288-afff-4c0086eb7ade\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:26.834656 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.834461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/373d78da-238e-4288-afff-4c0086eb7ade-sys\") pod \"perf-node-gather-daemonset-t5ldm\" (UID: \"373d78da-238e-4288-afff-4c0086eb7ade\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:26.834656 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.834478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9q6tv\" (UniqueName: \"kubernetes.io/projected/373d78da-238e-4288-afff-4c0086eb7ade-kube-api-access-9q6tv\") pod \"perf-node-gather-daemonset-t5ldm\" (UID: \"373d78da-238e-4288-afff-4c0086eb7ade\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:26.834656 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.834487 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/373d78da-238e-4288-afff-4c0086eb7ade-proc\") pod \"perf-node-gather-daemonset-t5ldm\" (UID: \"373d78da-238e-4288-afff-4c0086eb7ade\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:26.834656 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.834571 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/373d78da-238e-4288-afff-4c0086eb7ade-lib-modules\") pod \"perf-node-gather-daemonset-t5ldm\" (UID: \"373d78da-238e-4288-afff-4c0086eb7ade\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:26.834656 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.834608 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/373d78da-238e-4288-afff-4c0086eb7ade-podres\") pod \"perf-node-gather-daemonset-t5ldm\" (UID: \"373d78da-238e-4288-afff-4c0086eb7ade\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:26.834656 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.834624 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/373d78da-238e-4288-afff-4c0086eb7ade-sys\") pod \"perf-node-gather-daemonset-t5ldm\" (UID: \"373d78da-238e-4288-afff-4c0086eb7ade\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:26.844983 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.844949 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q6tv\" (UniqueName: \"kubernetes.io/projected/373d78da-238e-4288-afff-4c0086eb7ade-kube-api-access-9q6tv\") pod \"perf-node-gather-daemonset-t5ldm\" (UID: \"373d78da-238e-4288-afff-4c0086eb7ade\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:26.928052 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:26.928018 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:27.064123 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:27.064057 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm"] Apr 23 18:16:27.068455 ip-10-0-132-102 kubenswrapper[2576]: W0423 18:16:27.068402 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod373d78da_238e_4288_afff_4c0086eb7ade.slice/crio-0a17393f8e8540d24cda4ed7a444b71d8c52a9ce4983dd071568b5ef41db0783 WatchSource:0}: Error finding container 0a17393f8e8540d24cda4ed7a444b71d8c52a9ce4983dd071568b5ef41db0783: Status 404 returned error can't find the container with id 0a17393f8e8540d24cda4ed7a444b71d8c52a9ce4983dd071568b5ef41db0783 Apr 23 18:16:27.411899 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:27.411858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" 
event={"ID":"373d78da-238e-4288-afff-4c0086eb7ade","Type":"ContainerStarted","Data":"441beee1e6ef52ada2ff97b34e628e862e535bd8c99480212a560e0104d60c33"} Apr 23 18:16:27.412290 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:27.411906 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:27.412290 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:27.411920 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" event={"ID":"373d78da-238e-4288-afff-4c0086eb7ade","Type":"ContainerStarted","Data":"0a17393f8e8540d24cda4ed7a444b71d8c52a9ce4983dd071568b5ef41db0783"} Apr 23 18:16:27.427381 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:27.427326 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" podStartSLOduration=1.427308228 podStartE2EDuration="1.427308228s" podCreationTimestamp="2026-04-23 18:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:16:27.426977005 +0000 UTC m=+1457.662423731" watchObservedRunningTime="2026-04-23 18:16:27.427308228 +0000 UTC m=+1457.662754984" Apr 23 18:16:27.444927 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:27.444901 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lcwv7_4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9/dns/0.log" Apr 23 18:16:27.468751 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:27.468710 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lcwv7_4ec00fc8-34ea-4af6-892d-8c8dafb8d9a9/kube-rbac-proxy/0.log" Apr 23 18:16:27.643004 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:27.642975 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-n7pdd_3c4a21a3-0078-4632-bce8-ee31a63bceb2/dns-node-resolver/0.log" Apr 23 18:16:28.044757 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:28.044720 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7d6947dcbc-j7jjm_a0316b20-9d56-4972-9cdf-d2acf03e0921/registry/0.log" Apr 23 18:16:28.090871 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:28.090836 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9pnhp_4c608978-9ca3-4730-81a8-ed012e4601c4/node-ca/0.log" Apr 23 18:16:29.260971 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:29.260939 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fmtwc_bfd74eb8-918a-45f2-abb0-8342a3e4ebc4/serve-healthcheck-canary/0.log" Apr 23 18:16:29.800515 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:29.800486 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wd76j_940e0919-0fc3-4b70-81f5-5a818c8ded8c/kube-rbac-proxy/0.log" Apr 23 18:16:29.821761 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:29.821715 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wd76j_940e0919-0fc3-4b70-81f5-5a818c8ded8c/exporter/0.log" Apr 23 18:16:29.843386 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:29.843362 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wd76j_940e0919-0fc3-4b70-81f5-5a818c8ded8c/extractor/0.log" Apr 23 18:16:33.428860 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:33.427941 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-t5ldm" Apr 23 18:16:35.743648 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:35.743560 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-855ll_bcdca23a-69b4-4008-b9cd-3d1b6622c920/migrator/0.log" Apr 23 18:16:35.769120 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:35.769091 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-855ll_bcdca23a-69b4-4008-b9cd-3d1b6622c920/graceful-termination/0.log" Apr 23 18:16:37.213005 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:37.212949 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6g56n_ae56a92f-dfae-4763-b849-dca72bc2cf3d/kube-multus/0.log" Apr 23 18:16:37.220218 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:37.220192 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6g56n_ae56a92f-dfae-4763-b849-dca72bc2cf3d/kube-multus/1.log" Apr 23 18:16:37.276730 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:37.276704 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ncg5g_0c55482f-ee0e-4a40-a959-7530a690f4c2/kube-multus-additional-cni-plugins/0.log" Apr 23 18:16:37.303313 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:37.303287 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ncg5g_0c55482f-ee0e-4a40-a959-7530a690f4c2/egress-router-binary-copy/0.log" Apr 23 18:16:37.326711 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:37.326686 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ncg5g_0c55482f-ee0e-4a40-a959-7530a690f4c2/cni-plugins/0.log" Apr 23 18:16:37.350018 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:37.349985 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ncg5g_0c55482f-ee0e-4a40-a959-7530a690f4c2/bond-cni-plugin/0.log" Apr 23 18:16:37.374341 ip-10-0-132-102 
kubenswrapper[2576]: I0423 18:16:37.374318 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ncg5g_0c55482f-ee0e-4a40-a959-7530a690f4c2/routeoverride-cni/0.log" Apr 23 18:16:37.400274 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:37.400250 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ncg5g_0c55482f-ee0e-4a40-a959-7530a690f4c2/whereabouts-cni-bincopy/0.log" Apr 23 18:16:37.426981 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:37.426955 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ncg5g_0c55482f-ee0e-4a40-a959-7530a690f4c2/whereabouts-cni/0.log" Apr 23 18:16:37.891568 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:37.891531 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jfhpv_5baefb5e-77f1-440a-918c-82da4620b8d7/network-metrics-daemon/0.log" Apr 23 18:16:37.910963 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:37.910936 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jfhpv_5baefb5e-77f1-440a-918c-82da4620b8d7/kube-rbac-proxy/0.log" Apr 23 18:16:39.287182 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:39.287153 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tph9h_518ae3f8-909f-4ac9-932b-cf6c27fde0e0/ovn-controller/0.log" Apr 23 18:16:39.321235 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:39.321171 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tph9h_518ae3f8-909f-4ac9-932b-cf6c27fde0e0/ovn-acl-logging/0.log" Apr 23 18:16:39.359143 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:39.359116 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tph9h_518ae3f8-909f-4ac9-932b-cf6c27fde0e0/kube-rbac-proxy-node/0.log" Apr 23 18:16:39.394454 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:39.394429 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tph9h_518ae3f8-909f-4ac9-932b-cf6c27fde0e0/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 18:16:39.433890 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:39.433870 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tph9h_518ae3f8-909f-4ac9-932b-cf6c27fde0e0/northd/0.log" Apr 23 18:16:39.467272 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:39.467229 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tph9h_518ae3f8-909f-4ac9-932b-cf6c27fde0e0/nbdb/0.log" Apr 23 18:16:39.490185 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:39.490159 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tph9h_518ae3f8-909f-4ac9-932b-cf6c27fde0e0/sbdb/0.log" Apr 23 18:16:39.578302 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:39.578237 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tph9h_518ae3f8-909f-4ac9-932b-cf6c27fde0e0/ovnkube-controller/0.log" Apr 23 18:16:40.725221 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:40.725188 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-x77gx_41ba5b02-a248-4259-8ca2-8f501349c1b3/network-check-target-container/0.log" Apr 23 18:16:41.690199 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:41.690171 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-8rcgt_d34dc40a-b3d7-4330-a3aa-7c90a9055d36/iptables-alerter/0.log" Apr 23 18:16:42.640304 ip-10-0-132-102 kubenswrapper[2576]: I0423 18:16:42.640267 2576 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-z6prg_82216d67-3ae3-4fd5-be5c-85a939836d44/tuned/0.log"