Apr 24 21:14:29.763352 ip-10-0-132-118 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:14:29.763372 ip-10-0-132-118 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:14:29.763379 ip-10-0-132-118 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:14:29.763673 ip-10-0-132-118 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:14:39.953174 ip-10-0-132-118 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:14:39.953193 ip-10-0-132-118 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot a8d38499a0d3475ea7b6943c1367d6ca --
Apr 24 21:16:51.742019 ip-10-0-132-118 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:16:52.212929 ip-10-0-132-118 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:52.212929 ip-10-0-132-118 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:16:52.212929 ip-10-0-132-118 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:52.212929 ip-10-0-132-118 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:16:52.212929 ip-10-0-132-118 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:52.216060 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.215975 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:16:52.227309 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227285 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:52.227309 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227304 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:52.227309 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227308 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:52.227309 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227311 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:52.227309 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227314 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:52.227309 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227317 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227320 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227323 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227326 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227328 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227331 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227334 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227336 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227341 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227345 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227348 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227350 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227353 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227356 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227358 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227361 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227363 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227366 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227369 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:52.227516 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227372 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227375 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227378 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227385 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227387 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227390 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227393 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227395 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227398 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227400 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227403 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227405 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227408 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227410 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227412 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227417 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227420 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227424 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227428 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:52.227988 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227431 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227434 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227437 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227440 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227442 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227445 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227447 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227450 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227453 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227455 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227458 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227461 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227463 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227466 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227469 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227472 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227474 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227477 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227479 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:52.228482 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227482 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227485 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227487 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227490 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227492 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227495 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227498 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227500 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227503 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227508 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227511 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227514 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227516 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227519 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227522 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227525 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227529 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227532 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227534 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227537 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227539 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:52.229014 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227542 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227545 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.227547 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228612 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228620 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228623 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228626 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228628 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228631 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228633 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228636 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228639 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228641 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228644 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228646 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228649 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228651 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228654 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228657 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228659 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:52.229490 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228663 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228665 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228668 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228671 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228674 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228676 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228679 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228682 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228684 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228687 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228690 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228693 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228695 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228698 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228700 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228703 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228705 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228707 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228710 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:52.229992 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228712 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228715 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228717 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228720 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228722 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228725 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228728 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228730 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228733 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228735 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228738 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228741 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228743 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228777 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228781 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228784 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228787 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228789 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228792 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228794 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:52.230464 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228797 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228800 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228805 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228808 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228811 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228814 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228818 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228821 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228824 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228826 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228829 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228832 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228835 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228838 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228841 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228843 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228846 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228848 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228851 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:52.231012 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228853 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228856 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228859 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228861 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228864 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228867 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228869 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228872 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228875 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228880 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.228882 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.228949 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.228956 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.228962 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.228966 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.228971 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.228975 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.228979 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.228984 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.228987 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.228990 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:16:52.231473 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.228993 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.228997 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229000 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229003 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229006 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229009 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229012 2569 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229015 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229018 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229023 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229026 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229029 2569 flags.go:64] FLAG: --config-dir=""
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229032 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229036 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229040 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229043 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229046 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229049 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229052 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229055 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229059 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229062 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229065 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229069 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229072 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:16:52.231996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229075 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229078 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229081 2569 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229084 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229088 2569 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229092 2569 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229095 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229098 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229100 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229105 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229108 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229110 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229113 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229116 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229119 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229122 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229125 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 
24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229128 2569 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229131 2569 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229134 2569 flags.go:64] FLAG: --feature-gates="" Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229138 2569 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229141 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229144 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229147 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229150 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:16:52.232590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229153 2569 flags.go:64] FLAG: --help="false" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229156 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229160 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229163 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229166 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229169 2569 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229172 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229175 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229178 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229181 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229184 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229187 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229190 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229193 2569 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229196 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229199 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229202 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229205 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229208 2569 flags.go:64] FLAG: --lock-file="" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229210 2569 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229213 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229216 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229221 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:16:52.233253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229224 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229227 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229230 2569 flags.go:64] FLAG: --logging-format="text" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229233 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229237 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229240 2569 flags.go:64] FLAG: --manifest-url="" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229243 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229247 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229250 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229254 2569 flags.go:64] FLAG: --max-pods="110" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229257 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: 
I0424 21:16:52.229260 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229263 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229266 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229269 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229273 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229276 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229283 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229286 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229289 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229292 2569 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229295 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229300 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229303 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:16:52.233841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229306 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 24 
21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229309 2569 flags.go:64] FLAG: --port="10250" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229312 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229315 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0005407a01d213059" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229318 2569 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229321 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229324 2569 flags.go:64] FLAG: --register-node="true" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229327 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229329 2569 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229333 2569 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229336 2569 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229339 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229342 2569 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229346 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229349 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229352 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 
21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229355 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229358 2569 flags.go:64] FLAG: --runonce="false" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229361 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229364 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229367 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229370 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229373 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229376 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229379 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229382 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:16:52.234497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229385 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229387 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229390 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229393 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:16:52.235183 ip-10-0-132-118 
kubenswrapper[2569]: I0424 21:16:52.229397 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229399 2569 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229402 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229407 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229410 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229413 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229418 2569 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229420 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229423 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229426 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229429 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229432 2569 flags.go:64] FLAG: --v="2" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229436 2569 flags.go:64] FLAG: --version="false" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229442 2569 flags.go:64] FLAG: --vmodule="" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229446 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 
21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229449 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229536 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229539 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229543 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229545 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:16:52.235183 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229548 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229556 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229558 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229561 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229563 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229566 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229569 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229571 2569 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229574 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229577 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229581 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229584 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229588 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229591 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229594 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229596 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229599 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229601 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229604 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:16:52.235770 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229606 2569 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229609 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229611 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229614 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229617 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229619 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229625 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229628 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229630 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229633 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229636 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229638 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229641 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229643 2569 
feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229647 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229650 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229652 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229655 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229657 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229660 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229662 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:16:52.236290 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229665 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229668 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229670 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229673 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229675 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 
21:16:52.229678 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229681 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229684 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229686 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229688 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229691 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229694 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229696 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229699 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229702 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229704 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229707 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229710 2569 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNSInstall Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229713 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229715 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:16:52.236834 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229718 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229721 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229723 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229726 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229728 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229732 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229735 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229737 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229740 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229742 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229745 2569 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229749 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229765 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229769 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229772 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229775 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229778 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229780 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229783 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:52.237317 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229786 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:52.237801 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229788 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:52.237801 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.229791 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:52.237801 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.229796 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:16:52.237801 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.237114 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:16:52.237801 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.237132 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:16:52.237801 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237181 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:52.237801 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237187 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:52.237801 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237190 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:52.237801 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237194 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:52.237801 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237197 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:52.237801 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237200 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:52.237801 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237203 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:52.237801 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237206 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:52.237801 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237209 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:52.237801 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237212 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:52.237801 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237215 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237218 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237221 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237223 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237226 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237229 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237232 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237234 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237237 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237239 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237242 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237244 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237247 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237249 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237252 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237256 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237259 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237262 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237266 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237270 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:52.238196 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237273 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237276 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237279 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237282 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237284 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237287 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237290 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237292 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237296 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237298 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237301 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237304 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237306 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237309 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237312 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237314 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237317 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237320 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237324 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:52.238668 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237328 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237331 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237334 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237337 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237340 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237343 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237346 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237349 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237352 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237355 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237357 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237360 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237362 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237365 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237368 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237370 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237373 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237376 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237378 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237381 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:52.239176 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237384 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:52.239653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237386 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:52.239653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237389 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:52.239653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237392 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:52.239653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237394 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:52.239653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237397 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:52.239653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237399 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:52.239653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237402 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:52.239653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237404 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:52.239653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237407 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:52.239653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237409 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:52.239653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237412 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:52.239653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237415 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:52.239653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237417 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:52.239653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237420 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:52.239653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237422 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:52.239653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237425 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:52.240081 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.237430 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:16:52.240081 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237528 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:52.240081 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237532 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:52.240081 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237535 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:52.240081 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237538 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:52.240081 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237542 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:52.240081 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237545 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:52.240081 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237547 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:52.240081 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237550 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:52.240081 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237553 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:52.240081 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237555 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:52.240081 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237558 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:52.240081 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237560 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:52.240081 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237563 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:52.240081 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237566 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237568 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237571 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237573 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237576 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237578 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237581 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237583 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237586 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237589 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237591 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237594 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237597 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237599 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237602 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237604 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237607 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237610 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237612 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:52.240442 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237615 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237617 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237620 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237622 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237625 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237628 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237630 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237633 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237635 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237638 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237640 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237643 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237646 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237649 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237651 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237654 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237656 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237659 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237661 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237664 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:52.240997 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237666 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237668 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237672 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237675 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237677 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237680 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237682 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237685 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237687 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237690 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237692 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237695 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237697 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237700 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237702 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237705 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237707 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237710 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237713 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237715 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:52.241493 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237718 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:52.242047 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237720 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:52.242047 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237724 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:52.242047 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237730 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:52.242047 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237733 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:52.242047 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237736 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:52.242047 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237738 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:52.242047 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237741 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:52.242047 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237743 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:52.242047 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237746 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:52.242047 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237749 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:52.242047 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237751 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:52.242047 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237768 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:52.242047 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:52.237771 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:52.242047 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.237776 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:16:52.242047 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.238439 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:16:52.242420 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.240548 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:16:52.242420 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.241467 2569 server.go:1019] "Starting client certificate rotation"
Apr 24 21:16:52.242420 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.241559 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:16:52.242420 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.241956 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:16:52.271023 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.271001 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:16:52.273474 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.273457 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:16:52.289600 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.289582 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:16:52.296068 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.296051 2569 log.go:25] "Validated CRI v1 image API"
Apr 24 21:16:52.297381 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.297366 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:16:52.299381 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.299365 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:16:52.302883 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.302863 2569 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 bf0b0416-b4de-42a0-a844-3a8a3043eab1:/dev/nvme0n1p3 d1f460e6-fb66-4c76-84bb-a2f579742336:/dev/nvme0n1p4]
Apr 24 21:16:52.302952 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.302881 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:16:52.308503 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.308399 2569 manager.go:217] Machine: {Timestamp:2026-04-24 21:16:52.306529925 +0000 UTC m=+0.440928537 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3068607 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25f36690618abc8f0c7bf85a67e871 SystemUUID:ec25f366-9061-8abc-8f0c-7bf85a67e871 BootID:a8d38499-a0d3-475e-a7b6-943c1367d6ca Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1a:68:b0:c4:93 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1a:68:b0:c4:93 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:3e:8f:1e:fb:b9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:16:52.308503 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.308498 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:16:52.308603 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.308568 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:16:52.311049 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.311023 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:16:52.311182 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.311051 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-118.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:16:52.311224 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.311191 2569 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:16:52.311224 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.311199 2569 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:16:52.311224 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.311211 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:16:52.311224 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.311222 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:16:52.312120 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.312110 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:16:52.312214 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.312206 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:16:52.314332 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.314323 2569 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:16:52.314368 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.314339 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:16:52.314368 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.314350 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:16:52.314368 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.314363 2569 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:16:52.314449 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.314371 2569 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 24 21:16:52.315383 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.315359 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:16:52.315476 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.315389 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:16:52.318206 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.318192 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:16:52.320273 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.320257 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:16:52.321531 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.321519 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:16:52.321589 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.321536 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:16:52.321589 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.321549 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:16:52.321589 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.321555 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:16:52.321589 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.321561 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:16:52.321589 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.321567 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:16:52.321589 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.321572 2569 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 24 21:16:52.321589 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.321578 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:16:52.321589 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.321584 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:16:52.321589 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.321590 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:16:52.321843 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.321602 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:16:52.321843 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.321612 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:16:52.322590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.322580 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:16:52.322590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.322590 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:16:52.325452 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.325423 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:16:52.325452 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.325439 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-118.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:16:52.325452 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.325428 2569 reflector.go:200] "Failed 
to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-118.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:16:52.325945 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.325932 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:16:52.326004 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.325972 2569 server.go:1295] "Started kubelet" Apr 24 21:16:52.326053 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.326030 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:16:52.326814 ip-10-0-132-118 systemd[1]: Started Kubernetes Kubelet. Apr 24 21:16:52.326934 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.326800 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:16:52.330081 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.326874 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:16:52.331502 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.331472 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:16:52.332525 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.332497 2569 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:16:52.335279 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.334422 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-118.ec2.internal.18a96793c45233d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-118.ec2.internal,UID:ip-10-0-132-118.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-118.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.325946326 +0000 UTC m=+0.460344942,LastTimestamp:2026-04-24 21:16:52.325946326 +0000 UTC m=+0.460344942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-118.ec2.internal,}" Apr 24 21:16:52.335860 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.335843 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mdvl9" Apr 24 21:16:52.336733 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.336715 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:16:52.337598 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.337152 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:16:52.338210 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.338190 2569 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:16:52.338325 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.338314 2569 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:16:52.338691 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.338579 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:16:52.338858 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.338846 2569 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:16:52.338948 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.338929 2569 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:16:52.339033 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.338948 2569 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 21:16:52.340650 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.340635 2569 factory.go:55] Registering systemd factory Apr 24 21:16:52.340748 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.340659 2569 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:16:52.341237 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.341219 2569 factory.go:153] Registering CRI-O factory Apr 24 21:16:52.341237 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.341237 2569 factory.go:223] Registration of the crio container factory successfully Apr 24 21:16:52.341376 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.341291 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:16:52.341376 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.341315 2569 factory.go:103] Registering Raw factory Apr 24 21:16:52.341376 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.341328 2569 manager.go:1196] Started watching for new ooms in manager Apr 24 21:16:52.341861 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.341836 2569 manager.go:319] Starting recovery of all containers Apr 24 21:16:52.347866 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.347821 2569 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-118.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:16:52.348586 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.348539 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: 
csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:16:52.350203 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.350047 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mdvl9" Apr 24 21:16:52.351792 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.351776 2569 manager.go:324] Recovery completed Apr 24 21:16:52.357494 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.357480 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:52.359743 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.359729 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:52.359850 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.359771 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:52.359850 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.359785 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:52.360244 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.360229 2569 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:16:52.360244 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.360244 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:16:52.360365 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.360283 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:16:52.362656 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.362639 2569 policy_none.go:49] "None policy: Start" Apr 24 21:16:52.362656 
ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.362654 2569 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:16:52.362814 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.362664 2569 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:16:52.363860 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.363774 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-118.ec2.internal.18a96793c655e381 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-118.ec2.internal,UID:ip-10-0-132-118.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-132-118.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-132-118.ec2.internal,},FirstTimestamp:2026-04-24 21:16:52.359742337 +0000 UTC m=+0.494140948,LastTimestamp:2026-04-24 21:16:52.359742337 +0000 UTC m=+0.494140948,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-118.ec2.internal,}" Apr 24 21:16:52.399302 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.399288 2569 manager.go:341] "Starting Device Plugin manager" Apr 24 21:16:52.409663 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.399315 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:16:52.409663 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.399324 2569 server.go:85] "Starting device plugin registration server" Apr 24 21:16:52.409663 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.399533 2569 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 
21:16:52.409663 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.399545 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:16:52.409663 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.399677 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:16:52.409663 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.399746 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:16:52.409663 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.399767 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:16:52.409663 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.400202 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:16:52.409663 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.400228 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 21:16:52.430414 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.430382 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:16:52.431505 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.431485 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:16:52.431573 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.431509 2569 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:16:52.431573 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.431532 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 21:16:52.431573 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.431538 2569 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:16:52.431573 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.431570 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:16:52.434346 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.434329 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:16:52.500386 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.500348 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:52.501146 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.501131 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:52.501221 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.501157 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:52.501221 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.501168 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:52.501221 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.501195 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.506898 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.506885 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.506955 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.506903 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-118.ec2.internal\": node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 
21:16:52.532408 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.532386 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-118.ec2.internal"] Apr 24 21:16:52.532493 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.532438 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:52.533061 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.533047 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 21:16:52.533865 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.533852 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:52.533943 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.533882 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:52.533943 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.533902 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:52.535044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.535030 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:52.535188 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.535175 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.535233 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.535203 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:52.535715 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.535697 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:52.535834 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.535726 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:52.535834 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.535739 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:52.535834 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.535698 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:52.535834 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.535781 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:52.535834 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.535792 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:52.536895 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.536880 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.536972 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.536906 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:52.537534 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.537520 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:52.537597 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.537544 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:52.537597 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.537554 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:52.552277 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.552258 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-118.ec2.internal\" not found" node="ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.556359 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.556344 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-118.ec2.internal\" not found" node="ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.633467 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.633451 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 21:16:52.640467 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.640450 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/22cd6bac03b476ac7b2ad7c86b5343a4-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal\" (UID: \"22cd6bac03b476ac7b2ad7c86b5343a4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.640516 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.640473 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22cd6bac03b476ac7b2ad7c86b5343a4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal\" (UID: \"22cd6bac03b476ac7b2ad7c86b5343a4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.640516 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.640493 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0eda8f5a4a0c114d01bb72ca7f3afc81-config\") pod \"kube-apiserver-proxy-ip-10-0-132-118.ec2.internal\" (UID: \"0eda8f5a4a0c114d01bb72ca7f3afc81\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.733816 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.733799 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 21:16:52.741227 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.741213 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/22cd6bac03b476ac7b2ad7c86b5343a4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal\" (UID: \"22cd6bac03b476ac7b2ad7c86b5343a4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.741274 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.741235 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/22cd6bac03b476ac7b2ad7c86b5343a4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal\" (UID: \"22cd6bac03b476ac7b2ad7c86b5343a4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.741274 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.741251 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0eda8f5a4a0c114d01bb72ca7f3afc81-config\") pod \"kube-apiserver-proxy-ip-10-0-132-118.ec2.internal\" (UID: \"0eda8f5a4a0c114d01bb72ca7f3afc81\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.741351 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.741296 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0eda8f5a4a0c114d01bb72ca7f3afc81-config\") pod \"kube-apiserver-proxy-ip-10-0-132-118.ec2.internal\" (UID: \"0eda8f5a4a0c114d01bb72ca7f3afc81\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.741351 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.741308 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/22cd6bac03b476ac7b2ad7c86b5343a4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal\" (UID: \"22cd6bac03b476ac7b2ad7c86b5343a4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.741351 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.741325 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22cd6bac03b476ac7b2ad7c86b5343a4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal\" (UID: \"22cd6bac03b476ac7b2ad7c86b5343a4\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.834834 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.834786 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 21:16:52.854294 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.854280 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.858741 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:52.858718 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-118.ec2.internal" Apr 24 21:16:52.935527 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:52.935502 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 21:16:53.036062 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:53.036043 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 21:16:53.136647 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:53.136592 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 21:16:53.237170 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:53.237147 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 21:16:53.241472 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:53.241461 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 21:16:53.241592 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:53.241578 2569 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:16:53.337507 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:53.337481 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 21:16:53.337507 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:53.337502 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:16:53.349492 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:53.349473 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:16:53.352127 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:53.352094 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:11:52 +0000 UTC" deadline="2028-02-04 11:29:12.86634644 +0000 UTC" Apr 24 21:16:53.352183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:53.352128 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15614h12m19.514221653s" Apr 24 21:16:53.381576 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:53.381559 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:16:53.382999 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:53.382983 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-7wnks" Apr 24 21:16:53.393939 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:53.393899 2569 csr.go:270] "Certificate signing request is issued" 
logger="kubernetes.io/kubelet-serving" csr="csr-7wnks" Apr 24 21:16:53.418348 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:53.418322 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22cd6bac03b476ac7b2ad7c86b5343a4.slice/crio-929cfae65d4407df647540a2607329425bb07f87cbdb7c466d93857a8177dcde WatchSource:0}: Error finding container 929cfae65d4407df647540a2607329425bb07f87cbdb7c466d93857a8177dcde: Status 404 returned error can't find the container with id 929cfae65d4407df647540a2607329425bb07f87cbdb7c466d93857a8177dcde Apr 24 21:16:53.422734 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:53.422719 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:16:53.434049 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:53.434015 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal" event={"ID":"22cd6bac03b476ac7b2ad7c86b5343a4","Type":"ContainerStarted","Data":"929cfae65d4407df647540a2607329425bb07f87cbdb7c466d93857a8177dcde"} Apr 24 21:16:53.438272 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:53.438248 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 21:16:53.438921 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:53.438900 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eda8f5a4a0c114d01bb72ca7f3afc81.slice/crio-60d8a965ed8464b6a888ab81894fbeaa6fda6e6a688947c279f3737acd585c64 WatchSource:0}: Error finding container 60d8a965ed8464b6a888ab81894fbeaa6fda6e6a688947c279f3737acd585c64: Status 404 returned error can't find the container with id 60d8a965ed8464b6a888ab81894fbeaa6fda6e6a688947c279f3737acd585c64 Apr 24 21:16:53.538891 ip-10-0-132-118 kubenswrapper[2569]: E0424 
21:16:53.538861 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 21:16:53.639277 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:53.639257 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 21:16:53.709064 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:53.708108 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:16:53.739539 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:53.739516 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 21:16:53.840253 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:53.840221 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-118.ec2.internal\" not found" Apr 24 21:16:53.869942 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:53.869800 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:16:53.938276 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:53.938201 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal" Apr 24 21:16:53.962083 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:53.962053 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:16:53.962989 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:53.962919 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-118.ec2.internal" Apr 24 21:16:54.053271 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.053169 2569 warnings.go:110] "Warning: 
metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:16:54.315290 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.315214 2569 apiserver.go:52] "Watching apiserver" Apr 24 21:16:54.321081 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.321056 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:16:54.321987 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.321961 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-9s62d","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal","openshift-network-operator/iptables-alerter-288np","openshift-ovn-kubernetes/ovnkube-node-fsxch","kube-system/kube-apiserver-proxy-ip-10-0-132-118.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc","openshift-multus/multus-additional-cni-plugins-bhz9g","openshift-multus/multus-nwzth","openshift-multus/network-metrics-daemon-zqp7l","openshift-network-diagnostics/network-check-target-m9nk2","kube-system/konnectivity-agent-lhcn8","openshift-cluster-node-tuning-operator/tuned-djgd9"] Apr 24 21:16:54.324533 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.324509 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9s62d" Apr 24 21:16:54.324774 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.324701 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-288np" Apr 24 21:16:54.325831 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.325807 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.327096 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.327078 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.328375 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.328360 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.329635 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.329618 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.330871 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.330853 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:16:54.330998 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:54.330933 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:16:54.332576 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.332213 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:16:54.332576 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.332276 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:54.332576 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:54.332276 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:16:54.332576 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.332369 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:54.332576 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.332573 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:16:54.332821 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.332590 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:16:54.332821 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.332654 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fzlbr\"" Apr 24 21:16:54.333448 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.333432 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lhcn8" Apr 24 21:16:54.334297 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.334282 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:16:54.334834 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.334816 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.339959 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.339854 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:16:54.341069 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.341035 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:16:54.341177 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.341159 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:16:54.341177 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.341170 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:16:54.341304 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.341271 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:16:54.343889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.341810 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:16:54.343889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.342414 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jrbtm\"" Apr 24 21:16:54.343889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.342588 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:16:54.343889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.342674 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 
21:16:54.343889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.342807 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:16:54.343889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.342828 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lz4gl\"" Apr 24 21:16:54.343889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.342868 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qcgmz\"" Apr 24 21:16:54.343889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.342945 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:16:54.343889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.343084 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:16:54.343889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.343107 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:16:54.343889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.343179 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-t525h\"" Apr 24 21:16:54.343889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.343347 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:16:54.343889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.343440 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:54.343889 
ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.343445 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:54.343889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.343559 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:16:54.343889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.343663 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:16:54.343889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.343714 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-j85kl\"" Apr 24 21:16:54.344570 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.343903 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:16:54.344570 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.343994 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:16:54.344570 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.344016 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-ft8qp\"" Apr 24 21:16:54.344570 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.343903 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:16:54.344570 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.344181 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:16:54.345375 
ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.345354 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-czgd4\"" Apr 24 21:16:54.349059 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349038 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl94h\" (UniqueName: \"kubernetes.io/projected/ddd581ca-fe5d-4e33-965d-ad198f8af209-kube-api-access-zl94h\") pod \"network-metrics-daemon-zqp7l\" (UID: \"ddd581ca-fe5d-4e33-965d-ad198f8af209\") " pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:16:54.349161 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349073 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-sys\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.349161 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349100 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-run-openvswitch\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.349161 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349148 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-node-log\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.349315 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349190 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-sysconfig\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.349315 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349213 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-lib-modules\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.349315 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349234 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-var-lib-kubelet\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.349315 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349258 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5cc754c0-fce1-4135-a603-509ef613e62d-host-slash\") pod \"iptables-alerter-288np\" (UID: \"5cc754c0-fce1-4135-a603-509ef613e62d\") " pod="openshift-network-operator/iptables-alerter-288np" Apr 24 21:16:54.349315 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349284 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-device-dir\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.349548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349329 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-system-cni-dir\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.349548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349375 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-systemd\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.349548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349401 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-tuned\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.349548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349426 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c591371-ce4e-4fa2-8ae0-1f25308bf023-tmp\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.349548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349452 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4rfr\" (UniqueName: 
\"kubernetes.io/projected/8c591371-ce4e-4fa2-8ae0-1f25308bf023-kube-api-access-g4rfr\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.349548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349471 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnqlj\" (UniqueName: \"kubernetes.io/projected/c4045dfd-b9dc-46c0-9964-9335ef05615d-kube-api-access-fnqlj\") pod \"node-ca-9s62d\" (UID: \"c4045dfd-b9dc-46c0-9964-9335ef05615d\") " pod="openshift-image-registry/node-ca-9s62d" Apr 24 21:16:54.349548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349491 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-var-lib-openvswitch\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.349548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349507 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-multus-cni-dir\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.349918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349547 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-run-k8s-cni-cncf-io\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.349918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349579 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-etc-kubernetes\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.349918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349603 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-sysctl-conf\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.349918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349624 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4045dfd-b9dc-46c0-9964-9335ef05615d-host\") pod \"node-ca-9s62d\" (UID: \"c4045dfd-b9dc-46c0-9964-9335ef05615d\") " pod="openshift-image-registry/node-ca-9s62d" Apr 24 21:16:54.349918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349646 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-log-socket\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.349918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349674 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-cni-bin\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.349918 
ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349698 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68711042-2c0d-43ee-aac6-684c532f8d59-cnibin\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.349918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349736 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68711042-2c0d-43ee-aac6-684c532f8d59-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.349918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349804 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5cc754c0-fce1-4135-a603-509ef613e62d-iptables-alerter-script\") pod \"iptables-alerter-288np\" (UID: \"5cc754c0-fce1-4135-a603-509ef613e62d\") " pod="openshift-network-operator/iptables-alerter-288np" Apr 24 21:16:54.349918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349854 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-kubelet\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.349918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349884 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-run-ovn\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.349918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349909 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-cni-netd\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.350384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349934 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr4n4\" (UniqueName: \"kubernetes.io/projected/68711042-2c0d-43ee-aac6-684c532f8d59-kube-api-access-zr4n4\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.350384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.349969 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-registration-dir\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.350384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350009 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68711042-2c0d-43ee-aac6-684c532f8d59-cni-binary-copy\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 
21:16:54.350384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350040 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68711042-2c0d-43ee-aac6-684c532f8d59-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.350384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350063 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7cbedbda-69db-41ea-8652-f7e83cd6b251-konnectivity-ca\") pod \"konnectivity-agent-lhcn8\" (UID: \"7cbedbda-69db-41ea-8652-f7e83cd6b251\") " pod="kube-system/konnectivity-agent-lhcn8" Apr 24 21:16:54.350384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350086 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-run\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.350384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350112 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-host\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.350384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350136 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1e0c938-ec29-416f-8376-93b93ff2d991-ovn-node-metrics-cert\") pod 
\"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.350384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350159 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-kubernetes\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.350384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350200 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-socket-dir\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.350384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350225 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5zsk\" (UniqueName: \"kubernetes.io/projected/84bd406c-811a-4762-8450-a7fabd1f8bad-kube-api-access-r5zsk\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.350384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350248 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-multus-conf-dir\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.350384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350268 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-sysctl-d\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.350384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350294 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c4045dfd-b9dc-46c0-9964-9335ef05615d-serviceca\") pod \"node-ca-9s62d\" (UID: \"c4045dfd-b9dc-46c0-9964-9335ef05615d\") " pod="openshift-image-registry/node-ca-9s62d" Apr 24 21:16:54.350384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350332 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68711042-2c0d-43ee-aac6-684c532f8d59-system-cni-dir\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.350384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350358 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-multus-socket-dir-parent\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.351007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350389 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/10fe30c1-99bf-4857-b080-8aafa2ee3910-multus-daemon-config\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 
21:16:54.351007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350415 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-run-systemd\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.351007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350437 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.351007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350455 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-etc-selinux\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.351007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350479 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-cnibin\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.351007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350502 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-var-lib-cni-bin\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.351007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350524 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-hostroot\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.351007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350547 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmcxk\" (UniqueName: \"kubernetes.io/projected/10fe30c1-99bf-4857-b080-8aafa2ee3910-kube-api-access-tmcxk\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.351007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350564 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-run-netns\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.351007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350589 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1e0c938-ec29-416f-8376-93b93ff2d991-ovnkube-script-lib\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.351007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350611 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.351007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350634 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-var-lib-cni-multus\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.351007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350659 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-run-multus-certs\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.351007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350676 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ptzf\" (UniqueName: \"kubernetes.io/projected/e1e0c938-ec29-416f-8376-93b93ff2d991-kube-api-access-2ptzf\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.351007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350702 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-os-release\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 
24 21:16:54.351007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350734 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-run-netns\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.351548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350802 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrvxb\" (UniqueName: \"kubernetes.io/projected/5cc754c0-fce1-4135-a603-509ef613e62d-kube-api-access-qrvxb\") pod \"iptables-alerter-288np\" (UID: \"5cc754c0-fce1-4135-a603-509ef613e62d\") " pod="openshift-network-operator/iptables-alerter-288np" Apr 24 21:16:54.351548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350828 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-systemd-units\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.351548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350853 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68711042-2c0d-43ee-aac6-684c532f8d59-os-release\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.351548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350877 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-modprobe-d\") 
pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.351548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350905 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1e0c938-ec29-416f-8376-93b93ff2d991-ovnkube-config\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.351548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350938 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1e0c938-ec29-416f-8376-93b93ff2d991-env-overrides\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.351548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350962 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-sys-fs\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.351548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.350988 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/68711042-2c0d-43ee-aac6-684c532f8d59-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.351548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.351013 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7cbedbda-69db-41ea-8652-f7e83cd6b251-agent-certs\") pod \"konnectivity-agent-lhcn8\" (UID: \"7cbedbda-69db-41ea-8652-f7e83cd6b251\") " pod="kube-system/konnectivity-agent-lhcn8" Apr 24 21:16:54.351548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.351038 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-var-lib-kubelet\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.351548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.351071 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs\") pod \"network-metrics-daemon-zqp7l\" (UID: \"ddd581ca-fe5d-4e33-965d-ad198f8af209\") " pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:16:54.351548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.351094 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-slash\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.351548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.351117 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-etc-openvswitch\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 
21:16:54.351548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.351141 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-run-ovn-kubernetes\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.351548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.351218 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5srt\" (UniqueName: \"kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt\") pod \"network-check-target-m9nk2\" (UID: \"ac26e9c0-3977-40b9-a44e-d694b6663276\") " pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:16:54.351548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.351251 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10fe30c1-99bf-4857-b080-8aafa2ee3910-cni-binary-copy\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.395045 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.395018 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:11:53 +0000 UTC" deadline="2027-10-30 13:25:16.931882177 +0000 UTC" Apr 24 21:16:54.395146 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.395058 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13288h8m22.536840937s" Apr 24 21:16:54.435848 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.435821 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-132-118.ec2.internal" event={"ID":"0eda8f5a4a0c114d01bb72ca7f3afc81","Type":"ContainerStarted","Data":"60d8a965ed8464b6a888ab81894fbeaa6fda6e6a688947c279f3737acd585c64"} Apr 24 21:16:54.452044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452023 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.452132 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452057 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-var-lib-cni-multus\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.452132 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452083 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-run-multus-certs\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.452208 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452159 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.452247 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452210 2569 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-var-lib-cni-multus\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.452299 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452279 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-run-multus-certs\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.452331 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452286 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ptzf\" (UniqueName: \"kubernetes.io/projected/e1e0c938-ec29-416f-8376-93b93ff2d991-kube-api-access-2ptzf\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.452378 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452326 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-os-release\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.452378 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-run-netns\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.452474 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452378 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qrvxb\" (UniqueName: \"kubernetes.io/projected/5cc754c0-fce1-4135-a603-509ef613e62d-kube-api-access-qrvxb\") pod \"iptables-alerter-288np\" (UID: \"5cc754c0-fce1-4135-a603-509ef613e62d\") " pod="openshift-network-operator/iptables-alerter-288np" Apr 24 21:16:54.452474 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452429 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-os-release\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.452474 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452422 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-run-netns\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.452474 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452472 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-systemd-units\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.452635 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452499 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68711042-2c0d-43ee-aac6-684c532f8d59-os-release\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.452635 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452525 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-modprobe-d\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.452635 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452565 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-systemd-units\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.452635 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452565 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1e0c938-ec29-416f-8376-93b93ff2d991-ovnkube-config\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.452635 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452618 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1e0c938-ec29-416f-8376-93b93ff2d991-env-overrides\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.452941 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452644 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-sys-fs\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.452941 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452663 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68711042-2c0d-43ee-aac6-684c532f8d59-os-release\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.452941 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452661 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/68711042-2c0d-43ee-aac6-684c532f8d59-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.452941 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452715 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7cbedbda-69db-41ea-8652-f7e83cd6b251-agent-certs\") pod \"konnectivity-agent-lhcn8\" (UID: \"7cbedbda-69db-41ea-8652-f7e83cd6b251\") " pod="kube-system/konnectivity-agent-lhcn8" Apr 24 21:16:54.452941 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452743 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-var-lib-kubelet\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.452941 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452782 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs\") pod \"network-metrics-daemon-zqp7l\" (UID: \"ddd581ca-fe5d-4e33-965d-ad198f8af209\") " pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:16:54.452941 ip-10-0-132-118 kubenswrapper[2569]: I0424 
21:16:54.452808 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-slash\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.452941 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452843 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-etc-openvswitch\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.452941 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452868 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-run-ovn-kubernetes\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.452941 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452893 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5srt\" (UniqueName: \"kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt\") pod \"network-check-target-m9nk2\" (UID: \"ac26e9c0-3977-40b9-a44e-d694b6663276\") " pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:16:54.452941 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452918 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10fe30c1-99bf-4857-b080-8aafa2ee3910-cni-binary-copy\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 
21:16:54.452941 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452943 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zl94h\" (UniqueName: \"kubernetes.io/projected/ddd581ca-fe5d-4e33-965d-ad198f8af209-kube-api-access-zl94h\") pod \"network-metrics-daemon-zqp7l\" (UID: \"ddd581ca-fe5d-4e33-965d-ad198f8af209\") " pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452968 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-sys\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.452996 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-run-openvswitch\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453020 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-node-log\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453044 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-sysconfig\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" 
Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453068 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-lib-modules\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453105 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1e0c938-ec29-416f-8376-93b93ff2d991-env-overrides\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453111 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-var-lib-kubelet\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453163 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-var-lib-kubelet\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453190 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5cc754c0-fce1-4135-a603-509ef613e62d-host-slash\") pod \"iptables-alerter-288np\" (UID: \"5cc754c0-fce1-4135-a603-509ef613e62d\") " pod="openshift-network-operator/iptables-alerter-288np" 
Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453206 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1e0c938-ec29-416f-8376-93b93ff2d991-ovnkube-config\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453220 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-device-dir\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453224 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/68711042-2c0d-43ee-aac6-684c532f8d59-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453245 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-system-cni-dir\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453272 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-systemd\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " 
pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453290 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-sys-fs\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453296 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-tuned\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.453470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453321 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c591371-ce4e-4fa2-8ae0-1f25308bf023-tmp\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453336 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-run-openvswitch\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453346 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4rfr\" (UniqueName: \"kubernetes.io/projected/8c591371-ce4e-4fa2-8ae0-1f25308bf023-kube-api-access-g4rfr\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " 
pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453371 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnqlj\" (UniqueName: \"kubernetes.io/projected/c4045dfd-b9dc-46c0-9964-9335ef05615d-kube-api-access-fnqlj\") pod \"node-ca-9s62d\" (UID: \"c4045dfd-b9dc-46c0-9964-9335ef05615d\") " pod="openshift-image-registry/node-ca-9s62d" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453398 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-var-lib-openvswitch\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453422 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-multus-cni-dir\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453447 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-run-k8s-cni-cncf-io\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453473 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-etc-kubernetes\") pod \"multus-nwzth\" (UID: 
\"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453499 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-sysctl-conf\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453521 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4045dfd-b9dc-46c0-9964-9335ef05615d-host\") pod \"node-ca-9s62d\" (UID: \"c4045dfd-b9dc-46c0-9964-9335ef05615d\") " pod="openshift-image-registry/node-ca-9s62d" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453545 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-log-socket\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453571 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-cni-bin\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453586 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-sys\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " 
pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453596 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68711042-2c0d-43ee-aac6-684c532f8d59-cnibin\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453622 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68711042-2c0d-43ee-aac6-684c532f8d59-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453631 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-slash\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453647 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-sysconfig\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.454250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453649 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5cc754c0-fce1-4135-a603-509ef613e62d-iptables-alerter-script\") pod 
\"iptables-alerter-288np\" (UID: \"5cc754c0-fce1-4135-a603-509ef613e62d\") " pod="openshift-network-operator/iptables-alerter-288np" Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453684 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-kubelet\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453694 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5cc754c0-fce1-4135-a603-509ef613e62d-host-slash\") pod \"iptables-alerter-288np\" (UID: \"5cc754c0-fce1-4135-a603-509ef613e62d\") " pod="openshift-network-operator/iptables-alerter-288np" Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453676 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453715 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-run-ovn\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453273 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-modprobe-d\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453775 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10fe30c1-99bf-4857-b080-8aafa2ee3910-cni-binary-copy\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:54.453813 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453851 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-kubelet\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453880 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-system-cni-dir\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453894 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-run-ovn-kubernetes\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453915 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-cni-netd\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453948 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zr4n4\" (UniqueName: \"kubernetes.io/projected/68711042-2c0d-43ee-aac6-684c532f8d59-kube-api-access-zr4n4\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453976 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-registration-dir\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 
21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.454005 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68711042-2c0d-43ee-aac6-684c532f8d59-cni-binary-copy\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.454015 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-sysctl-conf\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.454024 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4045dfd-b9dc-46c0-9964-9335ef05615d-host\") pod \"node-ca-9s62d\" (UID: \"c4045dfd-b9dc-46c0-9964-9335ef05615d\") " pod="openshift-image-registry/node-ca-9s62d" Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.454033 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68711042-2c0d-43ee-aac6-684c532f8d59-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.455044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.453683 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-var-lib-kubelet\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " 
pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.455918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.454104 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-device-dir\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.455918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.454212 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-cni-bin\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.455918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.454222 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5cc754c0-fce1-4135-a603-509ef613e62d-iptables-alerter-script\") pod \"iptables-alerter-288np\" (UID: \"5cc754c0-fce1-4135-a603-509ef613e62d\") " pod="openshift-network-operator/iptables-alerter-288np" Apr 24 21:16:54.455918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.454582 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68711042-2c0d-43ee-aac6-684c532f8d59-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.455918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.454601 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7cbedbda-69db-41ea-8652-f7e83cd6b251-konnectivity-ca\") pod 
\"konnectivity-agent-lhcn8\" (UID: \"7cbedbda-69db-41ea-8652-f7e83cd6b251\") " pod="kube-system/konnectivity-agent-lhcn8" Apr 24 21:16:54.455918 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:54.454649 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs podName:ddd581ca-fe5d-4e33-965d-ad198f8af209 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:54.954618074 +0000 UTC m=+3.089016681 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs") pod "network-metrics-daemon-zqp7l" (UID: "ddd581ca-fe5d-4e33-965d-ad198f8af209") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:54.455918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.454914 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-var-lib-openvswitch\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.455918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.454932 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-run\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.455918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.454960 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-log-socket\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 
21:16:54.455918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.454971 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-host\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.455918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455002 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1e0c938-ec29-416f-8376-93b93ff2d991-ovn-node-metrics-cert\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.455918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455066 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-kubernetes\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.455918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455084 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68711042-2c0d-43ee-aac6-684c532f8d59-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.455918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455095 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-socket-dir\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.455918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455122 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-run-ovn\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.455918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455122 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5zsk\" (UniqueName: \"kubernetes.io/projected/84bd406c-811a-4762-8450-a7fabd1f8bad-kube-api-access-r5zsk\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455161 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-multus-conf-dir\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth" Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455172 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-node-log\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455206 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-cni-netd\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455236 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-systemd\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455525 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-kubernetes\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455575 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-host\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455585 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-run\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455616 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-sysctl-d\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9" Apr 24 
21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455647 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c4045dfd-b9dc-46c0-9964-9335ef05615d-serviceca\") pod \"node-ca-9s62d\" (UID: \"c4045dfd-b9dc-46c0-9964-9335ef05615d\") " pod="openshift-image-registry/node-ca-9s62d" Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455697 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-socket-dir\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455699 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68711042-2c0d-43ee-aac6-684c532f8d59-system-cni-dir\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455729 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68711042-2c0d-43ee-aac6-684c532f8d59-system-cni-dir\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g" Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455736 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-multus-socket-dir-parent\") pod \"multus-nwzth\" (UID: 
\"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455782 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/10fe30c1-99bf-4857-b080-8aafa2ee3910-multus-daemon-config\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455837 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-run-systemd\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch"
Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455864 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch"
Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455887 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-multus-conf-dir\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.456695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455897 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-etc-selinux\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455944 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-cnibin\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.455975 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-var-lib-cni-bin\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456001 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-hostroot\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456005 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-etc-selinux\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456027 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmcxk\" (UniqueName: \"kubernetes.io/projected/10fe30c1-99bf-4857-b080-8aafa2ee3910-kube-api-access-tmcxk\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456054 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-run-netns\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456100 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-run-netns\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456117 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-var-lib-cni-bin\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456139 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-hostroot\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456156 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-run-systemd\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456185 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-cnibin\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456203 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-multus-socket-dir-parent\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456238 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1e0c938-ec29-416f-8376-93b93ff2d991-ovnkube-script-lib\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456328 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c4045dfd-b9dc-46c0-9964-9335ef05615d-serviceca\") pod \"node-ca-9s62d\" (UID: \"c4045dfd-b9dc-46c0-9964-9335ef05615d\") " pod="openshift-image-registry/node-ca-9s62d"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456401 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456479 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-sysctl-d\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456526 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/10fe30c1-99bf-4857-b080-8aafa2ee3910-multus-daemon-config\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.457626 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456575 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/84bd406c-811a-4762-8450-a7fabd1f8bad-registration-dir\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc"
Apr 24 21:16:54.458498 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456643 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1e0c938-ec29-416f-8376-93b93ff2d991-etc-openvswitch\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch"
Apr 24 21:16:54.458498 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456680 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-etc-kubernetes\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.458498 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456684 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68711042-2c0d-43ee-aac6-684c532f8d59-cnibin\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g"
Apr 24 21:16:54.458498 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456727 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-multus-cni-dir\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.458498 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456734 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/10fe30c1-99bf-4857-b080-8aafa2ee3910-host-run-k8s-cni-cncf-io\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.458498 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456771 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1e0c938-ec29-416f-8376-93b93ff2d991-ovnkube-script-lib\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch"
Apr 24 21:16:54.458498 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.457073 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68711042-2c0d-43ee-aac6-684c532f8d59-cni-binary-copy\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g"
Apr 24 21:16:54.458498 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.457212 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7cbedbda-69db-41ea-8652-f7e83cd6b251-konnectivity-ca\") pod \"konnectivity-agent-lhcn8\" (UID: \"7cbedbda-69db-41ea-8652-f7e83cd6b251\") " pod="kube-system/konnectivity-agent-lhcn8"
Apr 24 21:16:54.458498 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.456887 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8c591371-ce4e-4fa2-8ae0-1f25308bf023-lib-modules\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9"
Apr 24 21:16:54.458498 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.457853 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1e0c938-ec29-416f-8376-93b93ff2d991-ovn-node-metrics-cert\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch"
Apr 24 21:16:54.458498 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.458047 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7cbedbda-69db-41ea-8652-f7e83cd6b251-agent-certs\") pod \"konnectivity-agent-lhcn8\" (UID: \"7cbedbda-69db-41ea-8652-f7e83cd6b251\") " pod="kube-system/konnectivity-agent-lhcn8"
Apr 24 21:16:54.459042 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.458939 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c591371-ce4e-4fa2-8ae0-1f25308bf023-tmp\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9"
Apr 24 21:16:54.459042 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.458977 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8c591371-ce4e-4fa2-8ae0-1f25308bf023-etc-tuned\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9"
Apr 24 21:16:54.466102 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.466077 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ptzf\" (UniqueName: \"kubernetes.io/projected/e1e0c938-ec29-416f-8376-93b93ff2d991-kube-api-access-2ptzf\") pod \"ovnkube-node-fsxch\" (UID: \"e1e0c938-ec29-416f-8376-93b93ff2d991\") " pod="openshift-ovn-kubernetes/ovnkube-node-fsxch"
Apr 24 21:16:54.466813 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.466799 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrvxb\" (UniqueName: \"kubernetes.io/projected/5cc754c0-fce1-4135-a603-509ef613e62d-kube-api-access-qrvxb\") pod \"iptables-alerter-288np\" (UID: \"5cc754c0-fce1-4135-a603-509ef613e62d\") " pod="openshift-network-operator/iptables-alerter-288np"
Apr 24 21:16:54.471909 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:54.471890 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:54.471973 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:54.471914 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:54.471973 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:54.471927 2569 projected.go:194] Error preparing data for projected volume kube-api-access-k5srt for pod openshift-network-diagnostics/network-check-target-m9nk2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:54.472071 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:54.471992 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt podName:ac26e9c0-3977-40b9-a44e-d694b6663276 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:54.971976189 +0000 UTC m=+3.106374805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-k5srt" (UniqueName: "kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt") pod "network-check-target-m9nk2" (UID: "ac26e9c0-3977-40b9-a44e-d694b6663276") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:54.479485 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.479457 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5zsk\" (UniqueName: \"kubernetes.io/projected/84bd406c-811a-4762-8450-a7fabd1f8bad-kube-api-access-r5zsk\") pod \"aws-ebs-csi-driver-node-jthzc\" (UID: \"84bd406c-811a-4762-8450-a7fabd1f8bad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc"
Apr 24 21:16:54.479573 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.479524 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr4n4\" (UniqueName: \"kubernetes.io/projected/68711042-2c0d-43ee-aac6-684c532f8d59-kube-api-access-zr4n4\") pod \"multus-additional-cni-plugins-bhz9g\" (UID: \"68711042-2c0d-43ee-aac6-684c532f8d59\") " pod="openshift-multus/multus-additional-cni-plugins-bhz9g"
Apr 24 21:16:54.479573 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.479539 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl94h\" (UniqueName: \"kubernetes.io/projected/ddd581ca-fe5d-4e33-965d-ad198f8af209-kube-api-access-zl94h\") pod \"network-metrics-daemon-zqp7l\" (UID: \"ddd581ca-fe5d-4e33-965d-ad198f8af209\") " pod="openshift-multus/network-metrics-daemon-zqp7l"
Apr 24 21:16:54.479907 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.479888 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnqlj\" (UniqueName: \"kubernetes.io/projected/c4045dfd-b9dc-46c0-9964-9335ef05615d-kube-api-access-fnqlj\") pod \"node-ca-9s62d\" (UID: \"c4045dfd-b9dc-46c0-9964-9335ef05615d\") " pod="openshift-image-registry/node-ca-9s62d"
Apr 24 21:16:54.480931 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.480912 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4rfr\" (UniqueName: \"kubernetes.io/projected/8c591371-ce4e-4fa2-8ae0-1f25308bf023-kube-api-access-g4rfr\") pod \"tuned-djgd9\" (UID: \"8c591371-ce4e-4fa2-8ae0-1f25308bf023\") " pod="openshift-cluster-node-tuning-operator/tuned-djgd9"
Apr 24 21:16:54.481013 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.480997 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmcxk\" (UniqueName: \"kubernetes.io/projected/10fe30c1-99bf-4857-b080-8aafa2ee3910-kube-api-access-tmcxk\") pod \"multus-nwzth\" (UID: \"10fe30c1-99bf-4857-b080-8aafa2ee3910\") " pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.636071 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.635987 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9s62d"
Apr 24 21:16:54.644916 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.644889 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-288np"
Apr 24 21:16:54.653280 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.653262 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch"
Apr 24 21:16:54.658211 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.658192 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc"
Apr 24 21:16:54.658328 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.658217 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:54.664657 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.664640 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bhz9g"
Apr 24 21:16:54.671168 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.671148 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nwzth"
Apr 24 21:16:54.678647 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.678621 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lhcn8"
Apr 24 21:16:54.684135 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.684117 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-djgd9"
Apr 24 21:16:54.959085 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:54.959010 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs\") pod \"network-metrics-daemon-zqp7l\" (UID: \"ddd581ca-fe5d-4e33-965d-ad198f8af209\") " pod="openshift-multus/network-metrics-daemon-zqp7l"
Apr 24 21:16:54.959213 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:54.959134 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:54.959213 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:54.959184 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs podName:ddd581ca-fe5d-4e33-965d-ad198f8af209 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:55.959169674 +0000 UTC m=+4.093568273 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs") pod "network-metrics-daemon-zqp7l" (UID: "ddd581ca-fe5d-4e33-965d-ad198f8af209") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:55.003063 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:55.003023 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cbedbda_69db_41ea_8652_f7e83cd6b251.slice/crio-379f2302c9dd7d68860cd595194e01a88b506a6b01d4667b73e3c124e591c1b0 WatchSource:0}: Error finding container 379f2302c9dd7d68860cd595194e01a88b506a6b01d4667b73e3c124e591c1b0: Status 404 returned error can't find the container with id 379f2302c9dd7d68860cd595194e01a88b506a6b01d4667b73e3c124e591c1b0
Apr 24 21:16:55.004145 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:55.004120 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4045dfd_b9dc_46c0_9964_9335ef05615d.slice/crio-e7cf2c18b764f1358d158a643dd2dd2ce9f2a527ccf14fd9a3a3db6d712f5be0 WatchSource:0}: Error finding container e7cf2c18b764f1358d158a643dd2dd2ce9f2a527ccf14fd9a3a3db6d712f5be0: Status 404 returned error can't find the container with id e7cf2c18b764f1358d158a643dd2dd2ce9f2a527ccf14fd9a3a3db6d712f5be0
Apr 24 21:16:55.005511 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:55.005016 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10fe30c1_99bf_4857_b080_8aafa2ee3910.slice/crio-c7d463433ffcf38cbd0a48a7e9e7cf37ef40016ed883e3484d3658af51c3d442 WatchSource:0}: Error finding container c7d463433ffcf38cbd0a48a7e9e7cf37ef40016ed883e3484d3658af51c3d442: Status 404 returned error can't find the container with id c7d463433ffcf38cbd0a48a7e9e7cf37ef40016ed883e3484d3658af51c3d442
Apr 24 21:16:55.008804 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:55.008683 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68711042_2c0d_43ee_aac6_684c532f8d59.slice/crio-ffa6d36948ade64b062e9d282fd9aa858a9e4fb289a2b8d5115b35964fa9f624 WatchSource:0}: Error finding container ffa6d36948ade64b062e9d282fd9aa858a9e4fb289a2b8d5115b35964fa9f624: Status 404 returned error can't find the container with id ffa6d36948ade64b062e9d282fd9aa858a9e4fb289a2b8d5115b35964fa9f624
Apr 24 21:16:55.009787 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:55.009749 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84bd406c_811a_4762_8450_a7fabd1f8bad.slice/crio-abc24157f1b02e7d9af1d7adb02c89c24deb7e88dcc3e7674396f0df6bfd4685 WatchSource:0}: Error finding container abc24157f1b02e7d9af1d7adb02c89c24deb7e88dcc3e7674396f0df6bfd4685: Status 404 returned error can't find the container with id abc24157f1b02e7d9af1d7adb02c89c24deb7e88dcc3e7674396f0df6bfd4685
Apr 24 21:16:55.010898 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:55.010591 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c591371_ce4e_4fa2_8ae0_1f25308bf023.slice/crio-fc18c74f88a67fea9ed97bd736dda9ccc72cb74b7a40c67a153f6269db9e9913 WatchSource:0}: Error finding container fc18c74f88a67fea9ed97bd736dda9ccc72cb74b7a40c67a153f6269db9e9913: Status 404 returned error can't find the container with id fc18c74f88a67fea9ed97bd736dda9ccc72cb74b7a40c67a153f6269db9e9913
Apr 24 21:16:55.011332 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:55.011313 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1e0c938_ec29_416f_8376_93b93ff2d991.slice/crio-eea832fa03540273dbdd38f32369e7728e1282f9c3d11bb7bbc2d1139ae56281 WatchSource:0}: Error finding container eea832fa03540273dbdd38f32369e7728e1282f9c3d11bb7bbc2d1139ae56281: Status 404 returned error can't find the container with id eea832fa03540273dbdd38f32369e7728e1282f9c3d11bb7bbc2d1139ae56281
Apr 24 21:16:55.012790 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:55.012748 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cc754c0_fce1_4135_a603_509ef613e62d.slice/crio-87a80b28efab14389f298fd275dba73b82494341d22a38a89d3688cefba47347 WatchSource:0}: Error finding container 87a80b28efab14389f298fd275dba73b82494341d22a38a89d3688cefba47347: Status 404 returned error can't find the container with id 87a80b28efab14389f298fd275dba73b82494341d22a38a89d3688cefba47347
Apr 24 21:16:55.059821 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.059691 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5srt\" (UniqueName: \"kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt\") pod \"network-check-target-m9nk2\" (UID: \"ac26e9c0-3977-40b9-a44e-d694b6663276\") " pod="openshift-network-diagnostics/network-check-target-m9nk2"
Apr 24 21:16:55.059887 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:55.059835 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:55.059887 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:55.059852 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:55.059887 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:55.059863 2569 projected.go:194] Error preparing data for projected volume kube-api-access-k5srt for pod openshift-network-diagnostics/network-check-target-m9nk2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:55.060005 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:55.059904 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt podName:ac26e9c0-3977-40b9-a44e-d694b6663276 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:56.059890488 +0000 UTC m=+4.194289086 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-k5srt" (UniqueName: "kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt") pod "network-check-target-m9nk2" (UID: "ac26e9c0-3977-40b9-a44e-d694b6663276") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:55.395667 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.395548 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:11:53 +0000 UTC" deadline="2027-12-22 14:25:53.05158796 +0000 UTC"
Apr 24 21:16:55.395667 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.395593 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14561h8m57.655999579s"
Apr 24 21:16:55.453776 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.450081 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-118.ec2.internal" event={"ID":"0eda8f5a4a0c114d01bb72ca7f3afc81","Type":"ContainerStarted","Data":"b5bc4cbaa315b276222a57a28b5de648f5e94bf475b3fa37e525258df139b29b"}
Apr 24 21:16:55.460971 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.460898 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" event={"ID":"e1e0c938-ec29-416f-8376-93b93ff2d991","Type":"ContainerStarted","Data":"eea832fa03540273dbdd38f32369e7728e1282f9c3d11bb7bbc2d1139ae56281"}
Apr 24 21:16:55.467092 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.467046 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-288np" event={"ID":"5cc754c0-fce1-4135-a603-509ef613e62d","Type":"ContainerStarted","Data":"87a80b28efab14389f298fd275dba73b82494341d22a38a89d3688cefba47347"}
Apr 24 21:16:55.471366 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.471339 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-djgd9" event={"ID":"8c591371-ce4e-4fa2-8ae0-1f25308bf023","Type":"ContainerStarted","Data":"fc18c74f88a67fea9ed97bd736dda9ccc72cb74b7a40c67a153f6269db9e9913"}
Apr 24 21:16:55.473628 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.473602 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bhz9g" event={"ID":"68711042-2c0d-43ee-aac6-684c532f8d59","Type":"ContainerStarted","Data":"ffa6d36948ade64b062e9d282fd9aa858a9e4fb289a2b8d5115b35964fa9f624"}
Apr 24 21:16:55.475888 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.475866 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9s62d" event={"ID":"c4045dfd-b9dc-46c0-9964-9335ef05615d","Type":"ContainerStarted","Data":"e7cf2c18b764f1358d158a643dd2dd2ce9f2a527ccf14fd9a3a3db6d712f5be0"}
Apr 24 21:16:55.482266 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.482222 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" event={"ID":"84bd406c-811a-4762-8450-a7fabd1f8bad","Type":"ContainerStarted","Data":"abc24157f1b02e7d9af1d7adb02c89c24deb7e88dcc3e7674396f0df6bfd4685"}
Apr 24 21:16:55.488324 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.488301 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nwzth" event={"ID":"10fe30c1-99bf-4857-b080-8aafa2ee3910","Type":"ContainerStarted","Data":"c7d463433ffcf38cbd0a48a7e9e7cf37ef40016ed883e3484d3658af51c3d442"}
Apr 24 21:16:55.494431 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.494408 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lhcn8" event={"ID":"7cbedbda-69db-41ea-8652-f7e83cd6b251","Type":"ContainerStarted","Data":"379f2302c9dd7d68860cd595194e01a88b506a6b01d4667b73e3c124e591c1b0"}
Apr 24 21:16:55.873858 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.873744 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-118.ec2.internal" podStartSLOduration=1.873722302 podStartE2EDuration="1.873722302s" podCreationTimestamp="2026-04-24 21:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:55.486704254 +0000 UTC m=+3.621102876" watchObservedRunningTime="2026-04-24 21:16:55.873722302 +0000 UTC m=+4.008120925"
Apr 24 21:16:55.874589 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.874269 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hpsfh"]
Apr 24 21:16:55.876799 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.876775 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hpsfh"
Apr 24 21:16:55.887445 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.887423 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4rdlg\""
Apr 24 21:16:55.887810 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.887790 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 21:16:55.888527 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.888504 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 21:16:55.967124 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.967099 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f3244bf-6a30-4809-b6e8-2f27daaaf6ae-tmp-dir\") pod \"node-resolver-hpsfh\" (UID: \"7f3244bf-6a30-4809-b6e8-2f27daaaf6ae\") " pod="openshift-dns/node-resolver-hpsfh"
Apr 24 21:16:55.967238 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.967141 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7brk\" (UniqueName: \"kubernetes.io/projected/7f3244bf-6a30-4809-b6e8-2f27daaaf6ae-kube-api-access-w7brk\") pod \"node-resolver-hpsfh\" (UID: \"7f3244bf-6a30-4809-b6e8-2f27daaaf6ae\") " pod="openshift-dns/node-resolver-hpsfh"
Apr 24 21:16:55.967238 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.967183 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7f3244bf-6a30-4809-b6e8-2f27daaaf6ae-hosts-file\") pod \"node-resolver-hpsfh\" (UID: \"7f3244bf-6a30-4809-b6e8-2f27daaaf6ae\") " pod="openshift-dns/node-resolver-hpsfh"
Apr 24 21:16:55.967238 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:55.967226 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs\") pod \"network-metrics-daemon-zqp7l\" (UID: \"ddd581ca-fe5d-4e33-965d-ad198f8af209\") " pod="openshift-multus/network-metrics-daemon-zqp7l"
Apr 24 21:16:55.967392 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:55.967354 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:55.967437 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:55.967410 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs podName:ddd581ca-fe5d-4e33-965d-ad198f8af209 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:57.967391498 +0000 UTC m=+6.101790098 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs") pod "network-metrics-daemon-zqp7l" (UID: "ddd581ca-fe5d-4e33-965d-ad198f8af209") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:56.068716 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:56.067956 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7f3244bf-6a30-4809-b6e8-2f27daaaf6ae-hosts-file\") pod \"node-resolver-hpsfh\" (UID: \"7f3244bf-6a30-4809-b6e8-2f27daaaf6ae\") " pod="openshift-dns/node-resolver-hpsfh"
Apr 24 21:16:56.068716 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:56.068034 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5srt\" (UniqueName: \"kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt\") pod \"network-check-target-m9nk2\" (UID: \"ac26e9c0-3977-40b9-a44e-d694b6663276\") " pod="openshift-network-diagnostics/network-check-target-m9nk2"
Apr 24 21:16:56.068716 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:56.068073 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f3244bf-6a30-4809-b6e8-2f27daaaf6ae-tmp-dir\") pod \"node-resolver-hpsfh\" (UID: \"7f3244bf-6a30-4809-b6e8-2f27daaaf6ae\") " pod="openshift-dns/node-resolver-hpsfh"
Apr 24 21:16:56.068716 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:56.068090 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7f3244bf-6a30-4809-b6e8-2f27daaaf6ae-hosts-file\") pod \"node-resolver-hpsfh\" (UID: \"7f3244bf-6a30-4809-b6e8-2f27daaaf6ae\") " pod="openshift-dns/node-resolver-hpsfh"
Apr 24 21:16:56.068716 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:56.068102 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7brk\" (UniqueName: \"kubernetes.io/projected/7f3244bf-6a30-4809-b6e8-2f27daaaf6ae-kube-api-access-w7brk\") pod \"node-resolver-hpsfh\" (UID: \"7f3244bf-6a30-4809-b6e8-2f27daaaf6ae\") " pod="openshift-dns/node-resolver-hpsfh"
Apr 24 21:16:56.068716 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:56.068233 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:56.068716 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:56.068251 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:56.068716 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:56.068264 2569 projected.go:194] Error preparing data for projected volume kube-api-access-k5srt for pod openshift-network-diagnostics/network-check-target-m9nk2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:56.068716 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:56.068317 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt podName:ac26e9c0-3977-40b9-a44e-d694b6663276 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:58.068297983 +0000 UTC m=+6.202696589 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-k5srt" (UniqueName: "kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt") pod "network-check-target-m9nk2" (UID: "ac26e9c0-3977-40b9-a44e-d694b6663276") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:56.068716 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:56.068676 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f3244bf-6a30-4809-b6e8-2f27daaaf6ae-tmp-dir\") pod \"node-resolver-hpsfh\" (UID: \"7f3244bf-6a30-4809-b6e8-2f27daaaf6ae\") " pod="openshift-dns/node-resolver-hpsfh"
Apr 24 21:16:56.092350 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:56.092322 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7brk\" (UniqueName: \"kubernetes.io/projected/7f3244bf-6a30-4809-b6e8-2f27daaaf6ae-kube-api-access-w7brk\") pod \"node-resolver-hpsfh\" (UID: \"7f3244bf-6a30-4809-b6e8-2f27daaaf6ae\") " pod="openshift-dns/node-resolver-hpsfh"
Apr 24 21:16:56.200171 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:56.200097 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/node-resolver-hpsfh" Apr 24 21:16:56.221267 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:16:56.221235 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f3244bf_6a30_4809_b6e8_2f27daaaf6ae.slice/crio-8ee278313b0f2b49f7d579cb761427ea127b58fff427605a8f051d495f8f892c WatchSource:0}: Error finding container 8ee278313b0f2b49f7d579cb761427ea127b58fff427605a8f051d495f8f892c: Status 404 returned error can't find the container with id 8ee278313b0f2b49f7d579cb761427ea127b58fff427605a8f051d495f8f892c Apr 24 21:16:56.434239 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:56.434211 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:16:56.434654 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:56.434332 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:16:56.434921 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:56.434728 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:16:56.434921 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:56.434859 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:16:56.506384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:56.506304 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hpsfh" event={"ID":"7f3244bf-6a30-4809-b6e8-2f27daaaf6ae","Type":"ContainerStarted","Data":"8ee278313b0f2b49f7d579cb761427ea127b58fff427605a8f051d495f8f892c"} Apr 24 21:16:56.523455 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:56.522389 2569 generic.go:358] "Generic (PLEG): container finished" podID="22cd6bac03b476ac7b2ad7c86b5343a4" containerID="383368573fc2aff02f49f8d7ecb84459273e3ae8972d8da9f42b262d3e9329a7" exitCode=0 Apr 24 21:16:56.523455 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:56.523252 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal" event={"ID":"22cd6bac03b476ac7b2ad7c86b5343a4","Type":"ContainerDied","Data":"383368573fc2aff02f49f8d7ecb84459273e3ae8972d8da9f42b262d3e9329a7"} Apr 24 21:16:57.529774 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:57.529689 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal" event={"ID":"22cd6bac03b476ac7b2ad7c86b5343a4","Type":"ContainerStarted","Data":"d6e70626edc9d5d5c00d7c18ab5b2d244c4eca60b45fd57be7c31122ea1f87b8"} Apr 24 21:16:57.981310 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:57.981162 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs\") pod \"network-metrics-daemon-zqp7l\" (UID: \"ddd581ca-fe5d-4e33-965d-ad198f8af209\") " pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:16:57.981470 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:57.981316 2569 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:57.981470 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:57.981393 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs podName:ddd581ca-fe5d-4e33-965d-ad198f8af209 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:01.98137317 +0000 UTC m=+10.115771774 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs") pod "network-metrics-daemon-zqp7l" (UID: "ddd581ca-fe5d-4e33-965d-ad198f8af209") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:58.082198 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:58.081948 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5srt\" (UniqueName: \"kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt\") pod \"network-check-target-m9nk2\" (UID: \"ac26e9c0-3977-40b9-a44e-d694b6663276\") " pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:16:58.082198 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:58.082105 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:58.082198 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:58.082126 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:58.082198 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:58.082138 2569 projected.go:194] Error preparing data for projected volume kube-api-access-k5srt for pod openshift-network-diagnostics/network-check-target-m9nk2: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:58.082198 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:58.082204 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt podName:ac26e9c0-3977-40b9-a44e-d694b6663276 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:02.082186463 +0000 UTC m=+10.216585061 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-k5srt" (UniqueName: "kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt") pod "network-check-target-m9nk2" (UID: "ac26e9c0-3977-40b9-a44e-d694b6663276") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:58.434940 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:58.434855 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:16:58.435128 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:58.434975 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:16:58.435587 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:16:58.435383 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:16:58.435587 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:16:58.435490 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:00.432501 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:00.432215 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:00.432501 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:00.432248 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:00.432501 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:00.432345 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:00.433010 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:00.432508 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:02.015030 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:02.014961 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs\") pod \"network-metrics-daemon-zqp7l\" (UID: \"ddd581ca-fe5d-4e33-965d-ad198f8af209\") " pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:02.015487 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:02.015107 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:17:02.015487 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:02.015200 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs podName:ddd581ca-fe5d-4e33-965d-ad198f8af209 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:10.015177766 +0000 UTC m=+18.149576368 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs") pod "network-metrics-daemon-zqp7l" (UID: "ddd581ca-fe5d-4e33-965d-ad198f8af209") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:17:02.115639 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:02.115608 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5srt\" (UniqueName: \"kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt\") pod \"network-check-target-m9nk2\" (UID: \"ac26e9c0-3977-40b9-a44e-d694b6663276\") " pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:02.115823 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:02.115789 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:17:02.115823 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:02.115805 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:17:02.115823 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:02.115814 2569 projected.go:194] Error preparing data for projected volume kube-api-access-k5srt for pod openshift-network-diagnostics/network-check-target-m9nk2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:17:02.116006 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:02.115863 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt podName:ac26e9c0-3977-40b9-a44e-d694b6663276 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:17:10.115850084 +0000 UTC m=+18.250248683 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-k5srt" (UniqueName: "kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt") pod "network-check-target-m9nk2" (UID: "ac26e9c0-3977-40b9-a44e-d694b6663276") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:17:02.434188 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:02.433683 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:02.434188 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:02.433823 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:02.434188 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:02.434160 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:02.434470 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:02.434240 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:04.431927 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:04.431838 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:04.432334 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:04.431848 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:04.432334 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:04.431960 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:04.432334 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:04.432064 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:06.432029 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:06.431995 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:06.432445 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:06.431997 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:06.432445 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:06.432126 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:06.432445 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:06.432212 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:08.432469 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:08.432432 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:08.432933 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:08.432432 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:08.432933 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:08.432586 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:08.432933 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:08.432617 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:10.075184 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:10.075150 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs\") pod \"network-metrics-daemon-zqp7l\" (UID: \"ddd581ca-fe5d-4e33-965d-ad198f8af209\") " pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:10.075689 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:10.075269 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:17:10.075689 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:10.075326 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs podName:ddd581ca-fe5d-4e33-965d-ad198f8af209 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:26.075306878 +0000 UTC m=+34.209705481 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs") pod "network-metrics-daemon-zqp7l" (UID: "ddd581ca-fe5d-4e33-965d-ad198f8af209") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:17:10.176094 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:10.176056 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5srt\" (UniqueName: \"kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt\") pod \"network-check-target-m9nk2\" (UID: \"ac26e9c0-3977-40b9-a44e-d694b6663276\") " pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:10.176309 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:10.176193 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:17:10.176309 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:10.176211 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:17:10.176309 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:10.176220 2569 projected.go:194] Error preparing data for projected volume kube-api-access-k5srt for pod openshift-network-diagnostics/network-check-target-m9nk2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:17:10.176309 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:10.176285 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt podName:ac26e9c0-3977-40b9-a44e-d694b6663276 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:17:26.176264916 +0000 UTC m=+34.310663577 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-k5srt" (UniqueName: "kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt") pod "network-check-target-m9nk2" (UID: "ac26e9c0-3977-40b9-a44e-d694b6663276") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:17:10.432287 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:10.432215 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:10.432446 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:10.432215 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:10.432446 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:10.432343 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:10.432558 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:10.432436 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:12.435205 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:12.435175 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:12.435545 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:12.435296 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:12.435545 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:12.435451 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:12.435658 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:12.435540 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:12.554215 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:12.554181 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bhz9g" event={"ID":"68711042-2c0d-43ee-aac6-684c532f8d59","Type":"ContainerStarted","Data":"7fa04a212d6d8eaee924e7d3b8fc272810f1bca3b9543c451d3b17a02dc590f4"} Apr 24 21:17:12.555466 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:12.555444 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9s62d" event={"ID":"c4045dfd-b9dc-46c0-9964-9335ef05615d","Type":"ContainerStarted","Data":"9dae0ac49776a85605a13fad260c664d625f2d97b00ef06012cb4a50b700efc9"} Apr 24 21:17:12.556581 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:12.556557 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hpsfh" event={"ID":"7f3244bf-6a30-4809-b6e8-2f27daaaf6ae","Type":"ContainerStarted","Data":"e01deed05698604cf8c69037db23cfa3f64a60f47e3c4636b7c04c6e048ef06b"} Apr 24 21:17:12.557603 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:12.557584 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" event={"ID":"84bd406c-811a-4762-8450-a7fabd1f8bad","Type":"ContainerStarted","Data":"20fd392ab771f5f47aab8091991cfa1a8c2705da9635e354e6eb554e40b4b892"} Apr 24 21:17:12.558837 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:12.558695 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nwzth" event={"ID":"10fe30c1-99bf-4857-b080-8aafa2ee3910","Type":"ContainerStarted","Data":"6a44f109b6ae43375f193245837daf7c0eb58509903f9099caca808aeffbdebf"} Apr 24 21:17:12.559862 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:12.559833 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lhcn8" 
event={"ID":"7cbedbda-69db-41ea-8652-f7e83cd6b251","Type":"ContainerStarted","Data":"d4c2e7b9bd3e890f5d80dd4a4a0de181e7961fba20d960d66c98a270df31ed94"} Apr 24 21:17:12.561182 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:12.561152 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-djgd9" event={"ID":"8c591371-ce4e-4fa2-8ae0-1f25308bf023","Type":"ContainerStarted","Data":"d4836f969e762d38e83bee3d7fbcf438b4d2ed4dec7676be5c2e9def86308814"} Apr 24 21:17:12.577136 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:12.577103 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-118.ec2.internal" podStartSLOduration=19.577092763 podStartE2EDuration="19.577092763s" podCreationTimestamp="2026-04-24 21:16:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:57.549804274 +0000 UTC m=+5.684202896" watchObservedRunningTime="2026-04-24 21:17:12.577092763 +0000 UTC m=+20.711491397" Apr 24 21:17:12.590076 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:12.590041 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9s62d" podStartSLOduration=8.059298137 podStartE2EDuration="20.590029851s" podCreationTimestamp="2026-04-24 21:16:52 +0000 UTC" firstStartedPulling="2026-04-24 21:16:55.008264523 +0000 UTC m=+3.142663122" lastFinishedPulling="2026-04-24 21:17:07.538996228 +0000 UTC m=+15.673394836" observedRunningTime="2026-04-24 21:17:12.589847789 +0000 UTC m=+20.724246424" watchObservedRunningTime="2026-04-24 21:17:12.590029851 +0000 UTC m=+20.724428471" Apr 24 21:17:12.606506 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:12.606415 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nwzth" podStartSLOduration=3.356815039 
podStartE2EDuration="20.606404692s" podCreationTimestamp="2026-04-24 21:16:52 +0000 UTC" firstStartedPulling="2026-04-24 21:16:55.008361376 +0000 UTC m=+3.142759988" lastFinishedPulling="2026-04-24 21:17:12.257951042 +0000 UTC m=+20.392349641" observedRunningTime="2026-04-24 21:17:12.606091182 +0000 UTC m=+20.740489803" watchObservedRunningTime="2026-04-24 21:17:12.606404692 +0000 UTC m=+20.740803303" Apr 24 21:17:12.640216 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:12.640028 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-djgd9" podStartSLOduration=3.42651884 podStartE2EDuration="20.640012894s" podCreationTimestamp="2026-04-24 21:16:52 +0000 UTC" firstStartedPulling="2026-04-24 21:16:55.012284643 +0000 UTC m=+3.146683250" lastFinishedPulling="2026-04-24 21:17:12.225778699 +0000 UTC m=+20.360177304" observedRunningTime="2026-04-24 21:17:12.621831752 +0000 UTC m=+20.756230372" watchObservedRunningTime="2026-04-24 21:17:12.640012894 +0000 UTC m=+20.774411516" Apr 24 21:17:12.656940 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:12.656885 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-lhcn8" podStartSLOduration=3.46087719 podStartE2EDuration="20.656867491s" podCreationTimestamp="2026-04-24 21:16:52 +0000 UTC" firstStartedPulling="2026-04-24 21:16:55.005547732 +0000 UTC m=+3.139946345" lastFinishedPulling="2026-04-24 21:17:12.201538035 +0000 UTC m=+20.335936646" observedRunningTime="2026-04-24 21:17:12.640927197 +0000 UTC m=+20.775325797" watchObservedRunningTime="2026-04-24 21:17:12.656867491 +0000 UTC m=+20.791266124" Apr 24 21:17:13.403158 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:13.403133 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:17:13.414187 ip-10-0-132-118 kubenswrapper[2569]: I0424 
21:17:13.414114 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:17:13.403154068Z","UUID":"6589be80-9f6d-4d4d-a038-4c6156f931e4","Handler":null,"Name":"","Endpoint":""} Apr 24 21:17:13.415399 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:13.415382 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:17:13.415480 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:13.415408 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:17:13.564926 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:13.564900 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" event={"ID":"e1e0c938-ec29-416f-8376-93b93ff2d991","Type":"ContainerStarted","Data":"241874d75d0e80f2d4ab6bc3e9d3ed6f98031707b791313468849b6d6d3fbf1b"} Apr 24 21:17:13.565585 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:13.564933 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" event={"ID":"e1e0c938-ec29-416f-8376-93b93ff2d991","Type":"ContainerStarted","Data":"d34e2c45d1b9f6c544516adea415b368470a3a085ffd1af357f45664316dba21"} Apr 24 21:17:13.565585 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:13.564946 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" event={"ID":"e1e0c938-ec29-416f-8376-93b93ff2d991","Type":"ContainerStarted","Data":"47f43c9a222575789e7cdce318526b4bc3067127bb22cc00e796aeb62a11b788"} Apr 24 21:17:13.565585 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:13.564957 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" 
event={"ID":"e1e0c938-ec29-416f-8376-93b93ff2d991","Type":"ContainerStarted","Data":"b1f14571964ebee7b826fa956dd0c838f41d37394833884a7ccbe282d292449d"} Apr 24 21:17:13.565585 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:13.564966 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" event={"ID":"e1e0c938-ec29-416f-8376-93b93ff2d991","Type":"ContainerStarted","Data":"4dd9ef791c96e5f77f11067da7d7163cb5d3596915b6edb1b4fac587021e9d09"} Apr 24 21:17:13.565585 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:13.564974 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" event={"ID":"e1e0c938-ec29-416f-8376-93b93ff2d991","Type":"ContainerStarted","Data":"a56c3658ee2cd1f564f1c37784825642e96473f8f9732a93acda8d288ae702aa"} Apr 24 21:17:13.566146 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:13.566121 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-288np" event={"ID":"5cc754c0-fce1-4135-a603-509ef613e62d","Type":"ContainerStarted","Data":"dae81aaf5b99dde0c245deef499bcdb153d65eee040c1cd1a0a73d8e8d561bc2"} Apr 24 21:17:13.567421 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:13.567402 2569 generic.go:358] "Generic (PLEG): container finished" podID="68711042-2c0d-43ee-aac6-684c532f8d59" containerID="7fa04a212d6d8eaee924e7d3b8fc272810f1bca3b9543c451d3b17a02dc590f4" exitCode=0 Apr 24 21:17:13.567494 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:13.567466 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bhz9g" event={"ID":"68711042-2c0d-43ee-aac6-684c532f8d59","Type":"ContainerDied","Data":"7fa04a212d6d8eaee924e7d3b8fc272810f1bca3b9543c451d3b17a02dc590f4"} Apr 24 21:17:13.569051 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:13.569029 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" 
event={"ID":"84bd406c-811a-4762-8450-a7fabd1f8bad","Type":"ContainerStarted","Data":"1e72f3244787da2bdb3698b930afb76fb83aa29b48e951755f5d98179386f76b"} Apr 24 21:17:13.584845 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:13.584795 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hpsfh" podStartSLOduration=2.608058224 podStartE2EDuration="18.584778776s" podCreationTimestamp="2026-04-24 21:16:55 +0000 UTC" firstStartedPulling="2026-04-24 21:16:56.224649598 +0000 UTC m=+4.359048203" lastFinishedPulling="2026-04-24 21:17:12.201370143 +0000 UTC m=+20.335768755" observedRunningTime="2026-04-24 21:17:12.659599661 +0000 UTC m=+20.793998282" watchObservedRunningTime="2026-04-24 21:17:13.584778776 +0000 UTC m=+21.719177407" Apr 24 21:17:13.585075 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:13.585052 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-288np" podStartSLOduration=4.373904957 podStartE2EDuration="21.585045715s" podCreationTimestamp="2026-04-24 21:16:52 +0000 UTC" firstStartedPulling="2026-04-24 21:16:55.014639413 +0000 UTC m=+3.149038022" lastFinishedPulling="2026-04-24 21:17:12.225780173 +0000 UTC m=+20.360178780" observedRunningTime="2026-04-24 21:17:13.584742908 +0000 UTC m=+21.719141533" watchObservedRunningTime="2026-04-24 21:17:13.585045715 +0000 UTC m=+21.719444337" Apr 24 21:17:14.431992 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:14.431913 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:14.432180 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:14.432040 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:14.432180 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:14.432101 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:14.432350 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:14.432321 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:14.572676 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:14.572643 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" event={"ID":"84bd406c-811a-4762-8450-a7fabd1f8bad","Type":"ContainerStarted","Data":"00a9a7b145a1d6375c2d704a41fcb08cc8e1ef54e57b7c0ab1db776b16072c25"} Apr 24 21:17:14.593633 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:14.593592 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jthzc" podStartSLOduration=3.5265338 podStartE2EDuration="22.593580602s" podCreationTimestamp="2026-04-24 21:16:52 +0000 UTC" firstStartedPulling="2026-04-24 21:16:55.012173647 +0000 UTC m=+3.146572257" lastFinishedPulling="2026-04-24 21:17:14.079220444 +0000 UTC m=+22.213619059" observedRunningTime="2026-04-24 21:17:14.593062938 +0000 UTC m=+22.727461560" watchObservedRunningTime="2026-04-24 21:17:14.593580602 +0000 UTC m=+22.727979268" Apr 24 21:17:15.577933 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:15.577765 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" event={"ID":"e1e0c938-ec29-416f-8376-93b93ff2d991","Type":"ContainerStarted","Data":"6e3a5b7440170945d67ac50e6fbfeb17a308d571c42080c1c0a653b507f358bd"} Apr 24 21:17:16.435535 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:16.435500 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:16.435535 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:16.435521 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:16.435804 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:16.435626 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:16.435804 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:16.435781 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:17.254217 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:17.254104 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-lhcn8" Apr 24 21:17:17.254951 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:17.254823 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-lhcn8" Apr 24 21:17:17.585449 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:17.585318 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" event={"ID":"e1e0c938-ec29-416f-8376-93b93ff2d991","Type":"ContainerStarted","Data":"55ded9bb21ede2e44ad249f61b6132011005360fbeb032ee4ddf9ac227efe6fb"} Apr 24 21:17:17.585961 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:17.585936 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:17:17.586032 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:17.585977 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-lhcn8" Apr 24 21:17:17.587022 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:17.586737 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-lhcn8" Apr 24 21:17:17.600448 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:17.600430 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:17:17.615987 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:17.615942 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" podStartSLOduration=8.124426863 podStartE2EDuration="25.615929458s" podCreationTimestamp="2026-04-24 21:16:52 +0000 UTC" 
firstStartedPulling="2026-04-24 21:16:55.014066041 +0000 UTC m=+3.148464645" lastFinishedPulling="2026-04-24 21:17:12.505568642 +0000 UTC m=+20.639967240" observedRunningTime="2026-04-24 21:17:17.614492687 +0000 UTC m=+25.748891308" watchObservedRunningTime="2026-04-24 21:17:17.615929458 +0000 UTC m=+25.750328078" Apr 24 21:17:18.431972 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:18.431946 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:18.431972 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:18.431969 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:18.432471 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:18.432031 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:18.432471 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:18.432151 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:18.588252 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:18.588220 2569 generic.go:358] "Generic (PLEG): container finished" podID="68711042-2c0d-43ee-aac6-684c532f8d59" containerID="b7db8ea1f05829c2c46c22fc1734af69610a85d2a4db0b48ad558cd631e64ea4" exitCode=0 Apr 24 21:17:18.588399 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:18.588256 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bhz9g" event={"ID":"68711042-2c0d-43ee-aac6-684c532f8d59","Type":"ContainerDied","Data":"b7db8ea1f05829c2c46c22fc1734af69610a85d2a4db0b48ad558cd631e64ea4"} Apr 24 21:17:18.588573 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:18.588559 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 21:17:18.589285 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:18.589030 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:17:18.602929 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:18.602912 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:17:19.591418 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:19.591357 2569 generic.go:358] "Generic (PLEG): container finished" podID="68711042-2c0d-43ee-aac6-684c532f8d59" containerID="665638adcb11a27f8ff8bafe52621b7118aa3543ee8a95308b547de46a712d99" exitCode=0 Apr 24 21:17:19.591808 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:19.591430 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bhz9g" event={"ID":"68711042-2c0d-43ee-aac6-684c532f8d59","Type":"ContainerDied","Data":"665638adcb11a27f8ff8bafe52621b7118aa3543ee8a95308b547de46a712d99"} Apr 24 21:17:19.591808 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:19.591555 
2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 21:17:20.432139 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:20.432069 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:20.432139 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:20.432082 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:20.432419 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:20.432153 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:20.432419 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:20.432287 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:20.594981 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:20.594955 2569 generic.go:358] "Generic (PLEG): container finished" podID="68711042-2c0d-43ee-aac6-684c532f8d59" containerID="1abaf852a307a8dfad0c6616819aa03b83ca4397296c547b1652af7da699d5af" exitCode=0 Apr 24 21:17:20.595451 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:20.595046 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bhz9g" event={"ID":"68711042-2c0d-43ee-aac6-684c532f8d59","Type":"ContainerDied","Data":"1abaf852a307a8dfad0c6616819aa03b83ca4397296c547b1652af7da699d5af"} Apr 24 21:17:20.595451 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:20.595201 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 21:17:22.432651 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:22.432612 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:22.433074 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:22.432707 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:22.433074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:22.432745 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:22.433074 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:22.432837 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:23.212509 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:23.212480 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:17:23.212716 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:23.212700 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 21:17:23.227953 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:23.227903 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" podUID="e1e0c938-ec29-416f-8376-93b93ff2d991" containerName="ovnkube-controller" probeResult="failure" output="" Apr 24 21:17:23.236849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:23.236818 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" podUID="e1e0c938-ec29-416f-8376-93b93ff2d991" containerName="ovnkube-controller" probeResult="failure" output="" Apr 24 21:17:24.432209 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:24.432173 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:24.432586 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:24.432173 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:24.432586 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:24.432278 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:24.432586 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:24.432372 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:26.096768 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:26.096723 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs\") pod \"network-metrics-daemon-zqp7l\" (UID: \"ddd581ca-fe5d-4e33-965d-ad198f8af209\") " pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:26.097215 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:26.096861 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:17:26.097215 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:26.096920 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs podName:ddd581ca-fe5d-4e33-965d-ad198f8af209 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:17:58.096905234 +0000 UTC m=+66.231303846 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs") pod "network-metrics-daemon-zqp7l" (UID: "ddd581ca-fe5d-4e33-965d-ad198f8af209") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:17:26.198034 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:26.198006 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5srt\" (UniqueName: \"kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt\") pod \"network-check-target-m9nk2\" (UID: \"ac26e9c0-3977-40b9-a44e-d694b6663276\") " pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:26.198161 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:26.198125 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:17:26.198161 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:26.198137 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:17:26.198161 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:26.198146 2569 projected.go:194] Error preparing data for projected volume kube-api-access-k5srt for pod openshift-network-diagnostics/network-check-target-m9nk2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:17:26.198252 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:26.198188 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt 
podName:ac26e9c0-3977-40b9-a44e-d694b6663276 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:58.19817616 +0000 UTC m=+66.332574760 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-k5srt" (UniqueName: "kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt") pod "network-check-target-m9nk2" (UID: "ac26e9c0-3977-40b9-a44e-d694b6663276") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:17:26.432727 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:26.432704 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:26.432876 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:26.432830 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:26.432876 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:26.432863 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:26.432988 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:26.432963 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:26.610005 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:26.609969 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bhz9g" event={"ID":"68711042-2c0d-43ee-aac6-684c532f8d59","Type":"ContainerStarted","Data":"5a6e95b7e1858dd5f58fadd3b4866a2f3b9e2543608b44921443a5dda880b46f"} Apr 24 21:17:27.613355 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:27.613320 2569 generic.go:358] "Generic (PLEG): container finished" podID="68711042-2c0d-43ee-aac6-684c532f8d59" containerID="5a6e95b7e1858dd5f58fadd3b4866a2f3b9e2543608b44921443a5dda880b46f" exitCode=0 Apr 24 21:17:27.613723 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:27.613398 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bhz9g" event={"ID":"68711042-2c0d-43ee-aac6-684c532f8d59","Type":"ContainerDied","Data":"5a6e95b7e1858dd5f58fadd3b4866a2f3b9e2543608b44921443a5dda880b46f"} Apr 24 21:17:28.432314 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:28.432283 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:28.432498 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:28.432371 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:28.432498 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:28.432432 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:28.432612 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:28.432519 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:28.617210 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:28.617181 2569 generic.go:358] "Generic (PLEG): container finished" podID="68711042-2c0d-43ee-aac6-684c532f8d59" containerID="a56df48c8355464921e95031848def32b71eb55d8fd226107f8ff1e7d32173a1" exitCode=0 Apr 24 21:17:28.617549 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:28.617229 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bhz9g" event={"ID":"68711042-2c0d-43ee-aac6-684c532f8d59","Type":"ContainerDied","Data":"a56df48c8355464921e95031848def32b71eb55d8fd226107f8ff1e7d32173a1"} Apr 24 21:17:29.621420 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:29.621249 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bhz9g" event={"ID":"68711042-2c0d-43ee-aac6-684c532f8d59","Type":"ContainerStarted","Data":"1e9398e4dada3c84ca428ec04a25c6d5abd4cddfd45de485e20abac193cd99b3"} Apr 24 21:17:29.647399 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:29.647350 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bhz9g" podStartSLOduration=6.287779416 podStartE2EDuration="37.647333678s" podCreationTimestamp="2026-04-24 21:16:52 +0000 UTC" firstStartedPulling="2026-04-24 21:16:55.010592417 +0000 UTC m=+3.144991018" lastFinishedPulling="2026-04-24 21:17:26.370146681 +0000 UTC 
m=+34.504545280" observedRunningTime="2026-04-24 21:17:29.64586661 +0000 UTC m=+37.780265269" watchObservedRunningTime="2026-04-24 21:17:29.647333678 +0000 UTC m=+37.781732300" Apr 24 21:17:30.431732 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:30.431703 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:30.431904 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:30.431735 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:30.431904 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:30.431804 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:30.431989 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:30.431951 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:32.433319 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:32.433287 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:32.433655 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:32.433371 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:32.433655 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:32.433466 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:32.433655 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:32.433570 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:34.431695 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:34.431663 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:34.432144 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:34.431795 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:34.432144 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:34.431859 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:34.432144 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:34.431931 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:36.432647 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:36.432615 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:36.433030 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:36.432625 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:36.433030 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:36.432710 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:36.433030 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:36.432805 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:38.432259 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:38.432230 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:38.432742 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:38.432267 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:38.432742 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:38.432346 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:38.432742 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:38.432512 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:38.887085 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:38.886964 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zqp7l"] Apr 24 21:17:38.887085 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:38.887087 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:38.887315 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:38.887200 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:38.888079 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:38.888058 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m9nk2"] Apr 24 21:17:38.888185 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:38.888125 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:38.888246 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:38.888194 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:40.432580 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:40.432548 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:40.433206 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:40.432554 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:40.433206 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:40.432661 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:40.433206 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:40.432721 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:42.433959 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:42.433708 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:42.434793 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:42.434734 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9nk2" podUID="ac26e9c0-3977-40b9-a44e-d694b6663276" Apr 24 21:17:42.434924 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:42.434818 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:42.436087 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:42.436057 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zqp7l" podUID="ddd581ca-fe5d-4e33-965d-ad198f8af209" Apr 24 21:17:43.226018 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.225953 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-118.ec2.internal" event="NodeReady" Apr 24 21:17:43.226142 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.226042 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:17:43.448014 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.447989 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cdfn8"] Apr 24 21:17:43.467162 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.467081 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6hhf2"] Apr 24 21:17:43.467263 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.467191 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cdfn8" Apr 24 21:17:43.469851 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.469835 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:17:43.469952 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.469878 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:17:43.470076 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.470061 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9vgbs\"" Apr 24 21:17:43.479818 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.479778 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6hhf2" Apr 24 21:17:43.487495 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.487479 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:17:43.488018 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.488004 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:17:43.488096 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.488084 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:17:43.488273 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.488262 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6g56g\"" Apr 24 21:17:43.497945 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.497930 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cdfn8"] Apr 24 21:17:43.499159 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.499133 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8928p"] Apr 24 21:17:43.516616 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.516590 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6hhf2"] Apr 24 21:17:43.516718 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.516687 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.525437 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.525418 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:17:43.525437 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.525437 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-26gnc\"" Apr 24 21:17:43.525559 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.525452 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:17:43.525559 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.525488 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:17:43.526561 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.526545 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:17:43.530898 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.530881 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8928p"] Apr 24 21:17:43.627409 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.627382 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1de28cba-9711-40f3-affe-fd088ee9e25b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8928p\" (UID: \"1de28cba-9711-40f3-affe-fd088ee9e25b\") " pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.627409 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.627411 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-p96pf\" (UniqueName: \"kubernetes.io/projected/1de28cba-9711-40f3-affe-fd088ee9e25b-kube-api-access-p96pf\") pod \"insights-runtime-extractor-8928p\" (UID: \"1de28cba-9711-40f3-affe-fd088ee9e25b\") " pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.627564 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.627433 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/65af2f93-95ea-42a0-a1bc-090aac46e966-tmp-dir\") pod \"dns-default-cdfn8\" (UID: \"65af2f93-95ea-42a0-a1bc-090aac46e966\") " pod="openshift-dns/dns-default-cdfn8" Apr 24 21:17:43.627564 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.627457 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1de28cba-9711-40f3-affe-fd088ee9e25b-crio-socket\") pod \"insights-runtime-extractor-8928p\" (UID: \"1de28cba-9711-40f3-affe-fd088ee9e25b\") " pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.627564 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.627481 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7wl5\" (UniqueName: \"kubernetes.io/projected/65af2f93-95ea-42a0-a1bc-090aac46e966-kube-api-access-k7wl5\") pod \"dns-default-cdfn8\" (UID: \"65af2f93-95ea-42a0-a1bc-090aac46e966\") " pod="openshift-dns/dns-default-cdfn8" Apr 24 21:17:43.627564 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.627500 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ab73dd3-791f-4c51-800b-67742dfd636b-cert\") pod \"ingress-canary-6hhf2\" (UID: \"3ab73dd3-791f-4c51-800b-67742dfd636b\") " pod="openshift-ingress-canary/ingress-canary-6hhf2" Apr 24 21:17:43.627564 ip-10-0-132-118 kubenswrapper[2569]: 
I0424 21:17:43.627515 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1de28cba-9711-40f3-affe-fd088ee9e25b-data-volume\") pod \"insights-runtime-extractor-8928p\" (UID: \"1de28cba-9711-40f3-affe-fd088ee9e25b\") " pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.627564 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.627534 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65af2f93-95ea-42a0-a1bc-090aac46e966-metrics-tls\") pod \"dns-default-cdfn8\" (UID: \"65af2f93-95ea-42a0-a1bc-090aac46e966\") " pod="openshift-dns/dns-default-cdfn8" Apr 24 21:17:43.627804 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.627566 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1de28cba-9711-40f3-affe-fd088ee9e25b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8928p\" (UID: \"1de28cba-9711-40f3-affe-fd088ee9e25b\") " pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.627804 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.627588 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65af2f93-95ea-42a0-a1bc-090aac46e966-config-volume\") pod \"dns-default-cdfn8\" (UID: \"65af2f93-95ea-42a0-a1bc-090aac46e966\") " pod="openshift-dns/dns-default-cdfn8" Apr 24 21:17:43.627804 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.627608 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp9ts\" (UniqueName: \"kubernetes.io/projected/3ab73dd3-791f-4c51-800b-67742dfd636b-kube-api-access-jp9ts\") pod \"ingress-canary-6hhf2\" 
(UID: \"3ab73dd3-791f-4c51-800b-67742dfd636b\") " pod="openshift-ingress-canary/ingress-canary-6hhf2" Apr 24 21:17:43.728824 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.728799 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7wl5\" (UniqueName: \"kubernetes.io/projected/65af2f93-95ea-42a0-a1bc-090aac46e966-kube-api-access-k7wl5\") pod \"dns-default-cdfn8\" (UID: \"65af2f93-95ea-42a0-a1bc-090aac46e966\") " pod="openshift-dns/dns-default-cdfn8" Apr 24 21:17:43.728943 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.728824 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ab73dd3-791f-4c51-800b-67742dfd636b-cert\") pod \"ingress-canary-6hhf2\" (UID: \"3ab73dd3-791f-4c51-800b-67742dfd636b\") " pod="openshift-ingress-canary/ingress-canary-6hhf2" Apr 24 21:17:43.728943 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.728857 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1de28cba-9711-40f3-affe-fd088ee9e25b-data-volume\") pod \"insights-runtime-extractor-8928p\" (UID: \"1de28cba-9711-40f3-affe-fd088ee9e25b\") " pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.729069 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.729049 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65af2f93-95ea-42a0-a1bc-090aac46e966-metrics-tls\") pod \"dns-default-cdfn8\" (UID: \"65af2f93-95ea-42a0-a1bc-090aac46e966\") " pod="openshift-dns/dns-default-cdfn8" Apr 24 21:17:43.729111 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.729088 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1de28cba-9711-40f3-affe-fd088ee9e25b-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-8928p\" (UID: \"1de28cba-9711-40f3-affe-fd088ee9e25b\") " pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.729148 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.729117 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65af2f93-95ea-42a0-a1bc-090aac46e966-config-volume\") pod \"dns-default-cdfn8\" (UID: \"65af2f93-95ea-42a0-a1bc-090aac46e966\") " pod="openshift-dns/dns-default-cdfn8" Apr 24 21:17:43.729148 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.729133 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jp9ts\" (UniqueName: \"kubernetes.io/projected/3ab73dd3-791f-4c51-800b-67742dfd636b-kube-api-access-jp9ts\") pod \"ingress-canary-6hhf2\" (UID: \"3ab73dd3-791f-4c51-800b-67742dfd636b\") " pod="openshift-ingress-canary/ingress-canary-6hhf2" Apr 24 21:17:43.729240 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.729163 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1de28cba-9711-40f3-affe-fd088ee9e25b-data-volume\") pod \"insights-runtime-extractor-8928p\" (UID: \"1de28cba-9711-40f3-affe-fd088ee9e25b\") " pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.729240 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.729172 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1de28cba-9711-40f3-affe-fd088ee9e25b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8928p\" (UID: \"1de28cba-9711-40f3-affe-fd088ee9e25b\") " pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.729240 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.729196 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p96pf\" (UniqueName: 
\"kubernetes.io/projected/1de28cba-9711-40f3-affe-fd088ee9e25b-kube-api-access-p96pf\") pod \"insights-runtime-extractor-8928p\" (UID: \"1de28cba-9711-40f3-affe-fd088ee9e25b\") " pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.729240 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.729230 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/65af2f93-95ea-42a0-a1bc-090aac46e966-tmp-dir\") pod \"dns-default-cdfn8\" (UID: \"65af2f93-95ea-42a0-a1bc-090aac46e966\") " pod="openshift-dns/dns-default-cdfn8" Apr 24 21:17:43.729430 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.729259 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1de28cba-9711-40f3-affe-fd088ee9e25b-crio-socket\") pod \"insights-runtime-extractor-8928p\" (UID: \"1de28cba-9711-40f3-affe-fd088ee9e25b\") " pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.730718 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.730111 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1de28cba-9711-40f3-affe-fd088ee9e25b-crio-socket\") pod \"insights-runtime-extractor-8928p\" (UID: \"1de28cba-9711-40f3-affe-fd088ee9e25b\") " pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.730718 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.730284 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65af2f93-95ea-42a0-a1bc-090aac46e966-config-volume\") pod \"dns-default-cdfn8\" (UID: \"65af2f93-95ea-42a0-a1bc-090aac46e966\") " pod="openshift-dns/dns-default-cdfn8" Apr 24 21:17:43.730718 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.730309 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/65af2f93-95ea-42a0-a1bc-090aac46e966-tmp-dir\") pod \"dns-default-cdfn8\" (UID: \"65af2f93-95ea-42a0-a1bc-090aac46e966\") " pod="openshift-dns/dns-default-cdfn8" Apr 24 21:17:43.733496 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.731393 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1de28cba-9711-40f3-affe-fd088ee9e25b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8928p\" (UID: \"1de28cba-9711-40f3-affe-fd088ee9e25b\") " pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.734307 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.734288 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1de28cba-9711-40f3-affe-fd088ee9e25b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8928p\" (UID: \"1de28cba-9711-40f3-affe-fd088ee9e25b\") " pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.734409 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.734337 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ab73dd3-791f-4c51-800b-67742dfd636b-cert\") pod \"ingress-canary-6hhf2\" (UID: \"3ab73dd3-791f-4c51-800b-67742dfd636b\") " pod="openshift-ingress-canary/ingress-canary-6hhf2" Apr 24 21:17:43.734813 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.734796 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65af2f93-95ea-42a0-a1bc-090aac46e966-metrics-tls\") pod \"dns-default-cdfn8\" (UID: \"65af2f93-95ea-42a0-a1bc-090aac46e966\") " pod="openshift-dns/dns-default-cdfn8" Apr 24 21:17:43.755979 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.755957 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-p96pf\" (UniqueName: \"kubernetes.io/projected/1de28cba-9711-40f3-affe-fd088ee9e25b-kube-api-access-p96pf\") pod \"insights-runtime-extractor-8928p\" (UID: \"1de28cba-9711-40f3-affe-fd088ee9e25b\") " pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.758748 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.758717 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp9ts\" (UniqueName: \"kubernetes.io/projected/3ab73dd3-791f-4c51-800b-67742dfd636b-kube-api-access-jp9ts\") pod \"ingress-canary-6hhf2\" (UID: \"3ab73dd3-791f-4c51-800b-67742dfd636b\") " pod="openshift-ingress-canary/ingress-canary-6hhf2" Apr 24 21:17:43.759186 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.759155 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7wl5\" (UniqueName: \"kubernetes.io/projected/65af2f93-95ea-42a0-a1bc-090aac46e966-kube-api-access-k7wl5\") pod \"dns-default-cdfn8\" (UID: \"65af2f93-95ea-42a0-a1bc-090aac46e966\") " pod="openshift-dns/dns-default-cdfn8" Apr 24 21:17:43.776147 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.776125 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cdfn8" Apr 24 21:17:43.786710 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.786691 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6hhf2" Apr 24 21:17:43.824673 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.824592 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8928p" Apr 24 21:17:43.940937 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.940909 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cdfn8"] Apr 24 21:17:43.945427 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:17:43.945399 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65af2f93_95ea_42a0_a1bc_090aac46e966.slice/crio-6636cb2d4dc4869adac4a9ed981f74d3cabea42b53da9cab4bc814a1db4f3382 WatchSource:0}: Error finding container 6636cb2d4dc4869adac4a9ed981f74d3cabea42b53da9cab4bc814a1db4f3382: Status 404 returned error can't find the container with id 6636cb2d4dc4869adac4a9ed981f74d3cabea42b53da9cab4bc814a1db4f3382 Apr 24 21:17:43.955069 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.955043 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6hhf2"] Apr 24 21:17:43.962583 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:17:43.961423 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ab73dd3_791f_4c51_800b_67742dfd636b.slice/crio-86852e96109cd5a5733f0cf4e83a2d9daf7446f8c6263e408062bbee2e04bf9e WatchSource:0}: Error finding container 86852e96109cd5a5733f0cf4e83a2d9daf7446f8c6263e408062bbee2e04bf9e: Status 404 returned error can't find the container with id 86852e96109cd5a5733f0cf4e83a2d9daf7446f8c6263e408062bbee2e04bf9e Apr 24 21:17:43.977426 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:43.977398 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8928p"] Apr 24 21:17:43.982653 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:17:43.982625 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1de28cba_9711_40f3_affe_fd088ee9e25b.slice/crio-eb674c0854ec144a7dccd3fa5e00ab37553b760476e6fe4c04d0392cc028db1f WatchSource:0}: Error finding container eb674c0854ec144a7dccd3fa5e00ab37553b760476e6fe4c04d0392cc028db1f: Status 404 returned error can't find the container with id eb674c0854ec144a7dccd3fa5e00ab37553b760476e6fe4c04d0392cc028db1f Apr 24 21:17:44.432386 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:44.432166 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:44.432555 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:44.432166 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:44.439590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:44.439196 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:17:44.439590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:44.439230 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6pwcs\"" Apr 24 21:17:44.439590 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:44.439458 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:17:44.440356 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:44.440336 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gjqgx\"" Apr 24 21:17:44.440448 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:44.440369 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:17:44.648585 ip-10-0-132-118 kubenswrapper[2569]: 
I0424 21:17:44.648550 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8928p" event={"ID":"1de28cba-9711-40f3-affe-fd088ee9e25b","Type":"ContainerStarted","Data":"7e846a204901ac50c9fb793a71c428d4722d85b2c2d2e4c86cbae1751fbda168"} Apr 24 21:17:44.648999 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:44.648593 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8928p" event={"ID":"1de28cba-9711-40f3-affe-fd088ee9e25b","Type":"ContainerStarted","Data":"eb674c0854ec144a7dccd3fa5e00ab37553b760476e6fe4c04d0392cc028db1f"} Apr 24 21:17:44.650229 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:44.650205 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6hhf2" event={"ID":"3ab73dd3-791f-4c51-800b-67742dfd636b","Type":"ContainerStarted","Data":"86852e96109cd5a5733f0cf4e83a2d9daf7446f8c6263e408062bbee2e04bf9e"} Apr 24 21:17:44.651501 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:44.651476 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cdfn8" event={"ID":"65af2f93-95ea-42a0-a1bc-090aac46e966","Type":"ContainerStarted","Data":"6636cb2d4dc4869adac4a9ed981f74d3cabea42b53da9cab4bc814a1db4f3382"} Apr 24 21:17:45.655565 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:45.655525 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8928p" event={"ID":"1de28cba-9711-40f3-affe-fd088ee9e25b","Type":"ContainerStarted","Data":"3462a596275df5fce9d8df90ab11cf25c3942cc8469e9d8b25a658a64dddc1ac"} Apr 24 21:17:46.659821 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.659786 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8928p" event={"ID":"1de28cba-9711-40f3-affe-fd088ee9e25b","Type":"ContainerStarted","Data":"661d22dc2b101841559894a726b15042533de24778c135e64fb934b27d7ba04d"} Apr 
24 21:17:46.661150 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.661129 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6hhf2" event={"ID":"3ab73dd3-791f-4c51-800b-67742dfd636b","Type":"ContainerStarted","Data":"b64ce83a485624ad0eaf6698920211c9674c5aebb9294c143ec821bb6b59e87d"} Apr 24 21:17:46.662456 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.662437 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cdfn8" event={"ID":"65af2f93-95ea-42a0-a1bc-090aac46e966","Type":"ContainerStarted","Data":"761f0d4310432a12cb2592465349a782480b0ea2d825b88cce8c65f810ebae8c"} Apr 24 21:17:46.662527 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.662461 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cdfn8" event={"ID":"65af2f93-95ea-42a0-a1bc-090aac46e966","Type":"ContainerStarted","Data":"a9b9f0d920b8dd8b91b73f48797da1a6833a32748adabda9b8c0b55972fe5c52"} Apr 24 21:17:46.662569 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.662559 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cdfn8" Apr 24 21:17:46.703184 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.703139 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8928p" podStartSLOduration=1.409729307 podStartE2EDuration="3.70312901s" podCreationTimestamp="2026-04-24 21:17:43 +0000 UTC" firstStartedPulling="2026-04-24 21:17:44.064476089 +0000 UTC m=+52.198874687" lastFinishedPulling="2026-04-24 21:17:46.357875786 +0000 UTC m=+54.492274390" observedRunningTime="2026-04-24 21:17:46.702680789 +0000 UTC m=+54.837079420" watchObservedRunningTime="2026-04-24 21:17:46.70312901 +0000 UTC m=+54.837527630" Apr 24 21:17:46.757337 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.757295 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/dns-default-cdfn8" podStartSLOduration=1.349715265 podStartE2EDuration="3.75728467s" podCreationTimestamp="2026-04-24 21:17:43 +0000 UTC" firstStartedPulling="2026-04-24 21:17:43.946782123 +0000 UTC m=+52.081180722" lastFinishedPulling="2026-04-24 21:17:46.35435151 +0000 UTC m=+54.488750127" observedRunningTime="2026-04-24 21:17:46.756555899 +0000 UTC m=+54.890954525" watchObservedRunningTime="2026-04-24 21:17:46.75728467 +0000 UTC m=+54.891683287" Apr 24 21:17:46.786252 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.786218 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6hhf2" podStartSLOduration=1.3946537110000001 podStartE2EDuration="3.786207961s" podCreationTimestamp="2026-04-24 21:17:43 +0000 UTC" firstStartedPulling="2026-04-24 21:17:43.964358521 +0000 UTC m=+52.098757120" lastFinishedPulling="2026-04-24 21:17:46.355912764 +0000 UTC m=+54.490311370" observedRunningTime="2026-04-24 21:17:46.784919436 +0000 UTC m=+54.919318057" watchObservedRunningTime="2026-04-24 21:17:46.786207961 +0000 UTC m=+54.920606582" Apr 24 21:17:46.824858 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.824834 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw"] Apr 24 21:17:46.827704 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.827691 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" Apr 24 21:17:46.831996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.831977 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 21:17:46.832270 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.832255 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:17:46.832528 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.832515 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:17:46.832739 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.832725 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 21:17:46.832908 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.832897 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:17:46.837224 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.837192 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-w6nxg\"" Apr 24 21:17:46.844895 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.844853 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw"] Apr 24 21:17:46.848967 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.848949 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-z9gbt"] Apr 24 21:17:46.872098 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.871014 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/kube-state-metrics-69db897b98-mmt7h"] Apr 24 21:17:46.874040 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.874015 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:46.874040 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.874033 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" Apr 24 21:17:46.875267 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.875248 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mmt7h"] Apr 24 21:17:46.877557 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.877532 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 24 21:17:46.877713 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.877694 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 24 21:17:46.877808 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.877693 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:17:46.878217 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.877973 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:17:46.878217 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.877982 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-q86tw\"" Apr 24 21:17:46.878217 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.878204 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:17:46.878388 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.878377 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fm79n\"" Apr 24 21:17:46.878853 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.878835 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 24 21:17:46.952508 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.952481 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1111e14-fede-42ce-8a23-a8d08526e8c6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dzbsw\" (UID: \"b1111e14-fede-42ce-8a23-a8d08526e8c6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" Apr 24 21:17:46.952595 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.952511 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc2hh\" (UniqueName: \"kubernetes.io/projected/b1111e14-fede-42ce-8a23-a8d08526e8c6-kube-api-access-hc2hh\") pod \"openshift-state-metrics-9d44df66c-dzbsw\" (UID: \"b1111e14-fede-42ce-8a23-a8d08526e8c6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" Apr 24 21:17:46.952595 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.952535 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1111e14-fede-42ce-8a23-a8d08526e8c6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-dzbsw\" (UID: \"b1111e14-fede-42ce-8a23-a8d08526e8c6\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" Apr 24 21:17:46.952681 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:46.952630 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1111e14-fede-42ce-8a23-a8d08526e8c6-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-dzbsw\" (UID: \"b1111e14-fede-42ce-8a23-a8d08526e8c6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" Apr 24 21:17:47.053699 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.053673 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fdgm\" (UniqueName: \"kubernetes.io/projected/1352a5df-0a6a-4562-ae3d-1061283310eb-kube-api-access-7fdgm\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.053816 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.053703 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-textfile\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.053816 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.053727 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1111e14-fede-42ce-8a23-a8d08526e8c6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-dzbsw\" (UID: \"b1111e14-fede-42ce-8a23-a8d08526e8c6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" Apr 24 21:17:47.053816 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.053777 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7ba14ffd-52bc-443a-827d-b237a10721e1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" Apr 24 21:17:47.053816 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.053814 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ba14ffd-52bc-443a-827d-b237a10721e1-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" Apr 24 21:17:47.053958 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.053831 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ba14ffd-52bc-443a-827d-b237a10721e1-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" Apr 24 21:17:47.053958 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.053862 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1352a5df-0a6a-4562-ae3d-1061283310eb-sys\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.053958 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.053882 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/7ba14ffd-52bc-443a-827d-b237a10721e1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" Apr 24 21:17:47.053958 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.053911 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-wtmp\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.054094 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.053956 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1111e14-fede-42ce-8a23-a8d08526e8c6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dzbsw\" (UID: \"b1111e14-fede-42ce-8a23-a8d08526e8c6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" Apr 24 21:17:47.054094 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.053980 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hc2hh\" (UniqueName: \"kubernetes.io/projected/b1111e14-fede-42ce-8a23-a8d08526e8c6-kube-api-access-hc2hh\") pod \"openshift-state-metrics-9d44df66c-dzbsw\" (UID: \"b1111e14-fede-42ce-8a23-a8d08526e8c6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" Apr 24 21:17:47.054094 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.054001 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z9gbt\" (UID: 
\"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.054094 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.054020 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7ba14ffd-52bc-443a-827d-b237a10721e1-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" Apr 24 21:17:47.054094 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.054038 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-tls\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.054094 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.054069 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1352a5df-0a6a-4562-ae3d-1061283310eb-metrics-client-ca\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.054301 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.054103 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1352a5df-0a6a-4562-ae3d-1061283310eb-root\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.054301 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.054159 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/b1111e14-fede-42ce-8a23-a8d08526e8c6-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-dzbsw\" (UID: \"b1111e14-fede-42ce-8a23-a8d08526e8c6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" Apr 24 21:17:47.054301 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.054194 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-accelerators-collector-config\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.054301 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.054220 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s87b\" (UniqueName: \"kubernetes.io/projected/7ba14ffd-52bc-443a-827d-b237a10721e1-kube-api-access-5s87b\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" Apr 24 21:17:47.055250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.055231 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1111e14-fede-42ce-8a23-a8d08526e8c6-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-dzbsw\" (UID: \"b1111e14-fede-42ce-8a23-a8d08526e8c6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" Apr 24 21:17:47.057310 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.057285 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1111e14-fede-42ce-8a23-a8d08526e8c6-openshift-state-metrics-kube-rbac-proxy-config\") pod 
\"openshift-state-metrics-9d44df66c-dzbsw\" (UID: \"b1111e14-fede-42ce-8a23-a8d08526e8c6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" Apr 24 21:17:47.057429 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.057412 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1111e14-fede-42ce-8a23-a8d08526e8c6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dzbsw\" (UID: \"b1111e14-fede-42ce-8a23-a8d08526e8c6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" Apr 24 21:17:47.063136 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.063116 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc2hh\" (UniqueName: \"kubernetes.io/projected/b1111e14-fede-42ce-8a23-a8d08526e8c6-kube-api-access-hc2hh\") pod \"openshift-state-metrics-9d44df66c-dzbsw\" (UID: \"b1111e14-fede-42ce-8a23-a8d08526e8c6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" Apr 24 21:17:47.137908 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.137856 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" Apr 24 21:17:47.154603 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.154582 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-accelerators-collector-config\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.154671 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.154612 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s87b\" (UniqueName: \"kubernetes.io/projected/7ba14ffd-52bc-443a-827d-b237a10721e1-kube-api-access-5s87b\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" Apr 24 21:17:47.154671 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.154631 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fdgm\" (UniqueName: \"kubernetes.io/projected/1352a5df-0a6a-4562-ae3d-1061283310eb-kube-api-access-7fdgm\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.154671 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.154648 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-textfile\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.154827 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.154676 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7ba14ffd-52bc-443a-827d-b237a10721e1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" Apr 24 21:17:47.154827 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.154717 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ba14ffd-52bc-443a-827d-b237a10721e1-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" Apr 24 21:17:47.154827 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.154769 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ba14ffd-52bc-443a-827d-b237a10721e1-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" Apr 24 21:17:47.154827 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.154801 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1352a5df-0a6a-4562-ae3d-1061283310eb-sys\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.154966 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.154831 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7ba14ffd-52bc-443a-827d-b237a10721e1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: 
\"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" Apr 24 21:17:47.154966 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.154854 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-wtmp\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.154966 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.154885 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.154966 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.154908 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7ba14ffd-52bc-443a-827d-b237a10721e1-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" Apr 24 21:17:47.154966 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.154934 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-tls\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.155196 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:47.154949 2569 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret 
"kube-state-metrics-tls" not found Apr 24 21:17:47.155196 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.154978 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1352a5df-0a6a-4562-ae3d-1061283310eb-metrics-client-ca\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.155196 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.155007 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1352a5df-0a6a-4562-ae3d-1061283310eb-root\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt" Apr 24 21:17:47.155196 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:47.155030 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ba14ffd-52bc-443a-827d-b237a10721e1-kube-state-metrics-tls podName:7ba14ffd-52bc-443a-827d-b237a10721e1 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:47.655004224 +0000 UTC m=+55.789402828 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/7ba14ffd-52bc-443a-827d-b237a10721e1-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-mmt7h" (UID: "7ba14ffd-52bc-443a-827d-b237a10721e1") : secret "kube-state-metrics-tls" not found
Apr 24 21:17:47.155196 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.155054 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1352a5df-0a6a-4562-ae3d-1061283310eb-root\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt"
Apr 24 21:17:47.155196 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.155064 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-wtmp\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt"
Apr 24 21:17:47.155453 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:47.155238 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 24 21:17:47.155453 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:47.155296 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-tls podName:1352a5df-0a6a-4562-ae3d-1061283310eb nodeName:}" failed. No retries permitted until 2026-04-24 21:17:47.65528086 +0000 UTC m=+55.789679475 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-tls") pod "node-exporter-z9gbt" (UID: "1352a5df-0a6a-4562-ae3d-1061283310eb") : secret "node-exporter-tls" not found
Apr 24 21:17:47.155453 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.155333 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1352a5df-0a6a-4562-ae3d-1061283310eb-sys\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt"
Apr 24 21:17:47.155453 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.155359 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7ba14ffd-52bc-443a-827d-b237a10721e1-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h"
Apr 24 21:17:47.155929 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.155696 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ba14ffd-52bc-443a-827d-b237a10721e1-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h"
Apr 24 21:17:47.155929 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.155786 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-textfile\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt"
Apr 24 21:17:47.156038 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.155922 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-accelerators-collector-config\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt"
Apr 24 21:17:47.156038 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.155922 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7ba14ffd-52bc-443a-827d-b237a10721e1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h"
Apr 24 21:17:47.156038 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.155990 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1352a5df-0a6a-4562-ae3d-1061283310eb-metrics-client-ca\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt"
Apr 24 21:17:47.157442 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.157422 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7ba14ffd-52bc-443a-827d-b237a10721e1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h"
Apr 24 21:17:47.157540 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.157524 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt"
Apr 24 21:17:47.165532 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.165512 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fdgm\" (UniqueName: \"kubernetes.io/projected/1352a5df-0a6a-4562-ae3d-1061283310eb-kube-api-access-7fdgm\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt"
Apr 24 21:17:47.165831 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.165812 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s87b\" (UniqueName: \"kubernetes.io/projected/7ba14ffd-52bc-443a-827d-b237a10721e1-kube-api-access-5s87b\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h"
Apr 24 21:17:47.256373 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.256342 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw"]
Apr 24 21:17:47.261048 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:17:47.261022 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1111e14_fede_42ce_8a23_a8d08526e8c6.slice/crio-b0492cc79a27d77ba108f45f615a21bd59bbdd621c3ca8446fe6cc032844d2ea WatchSource:0}: Error finding container b0492cc79a27d77ba108f45f615a21bd59bbdd621c3ca8446fe6cc032844d2ea: Status 404 returned error can't find the container with id b0492cc79a27d77ba108f45f615a21bd59bbdd621c3ca8446fe6cc032844d2ea
Apr 24 21:17:47.658976 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.658945 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ba14ffd-52bc-443a-827d-b237a10721e1-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h"
Apr 24 21:17:47.659116 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.658999 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-tls\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt"
Apr 24 21:17:47.661093 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.661076 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1352a5df-0a6a-4562-ae3d-1061283310eb-node-exporter-tls\") pod \"node-exporter-z9gbt\" (UID: \"1352a5df-0a6a-4562-ae3d-1061283310eb\") " pod="openshift-monitoring/node-exporter-z9gbt"
Apr 24 21:17:47.661370 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.661136 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ba14ffd-52bc-443a-827d-b237a10721e1-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mmt7h\" (UID: \"7ba14ffd-52bc-443a-827d-b237a10721e1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h"
Apr 24 21:17:47.667209 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.667178 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" event={"ID":"b1111e14-fede-42ce-8a23-a8d08526e8c6","Type":"ContainerStarted","Data":"4bf4c04e539e5df701935cd4f37237e51e2dfa79a89765d5bc7669d860ee581a"}
Apr 24 21:17:47.667321 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.667222 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" event={"ID":"b1111e14-fede-42ce-8a23-a8d08526e8c6","Type":"ContainerStarted","Data":"f4161cac789f0b571ec5e60b3f6700d2df1545c273b302cbb7b7d4dd314e672c"}
Apr 24 21:17:47.667321 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.667237 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" event={"ID":"b1111e14-fede-42ce-8a23-a8d08526e8c6","Type":"ContainerStarted","Data":"b0492cc79a27d77ba108f45f615a21bd59bbdd621c3ca8446fe6cc032844d2ea"}
Apr 24 21:17:47.784645 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.784620 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h"
Apr 24 21:17:47.789582 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.789375 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-z9gbt"
Apr 24 21:17:47.799615 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.799461 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:17:47.804261 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:17:47.804233 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1352a5df_0a6a_4562_ae3d_1061283310eb.slice/crio-c0910cfe0ca8ee1104e8ebee64a9a2bd4ddceb5f21928313b96276fd65c918aa WatchSource:0}: Error finding container c0910cfe0ca8ee1104e8ebee64a9a2bd4ddceb5f21928313b96276fd65c918aa: Status 404 returned error can't find the container with id c0910cfe0ca8ee1104e8ebee64a9a2bd4ddceb5f21928313b96276fd65c918aa
Apr 24 21:17:47.805542 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.805524 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:47.808915 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.808894 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 24 21:17:47.809165 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.809148 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 24 21:17:47.809243 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.809230 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-j2f79\""
Apr 24 21:17:47.809408 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.809387 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 24 21:17:47.809505 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.809459 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 24 21:17:47.809586 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.809571 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 24 21:17:47.810137 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.809780 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 24 21:17:47.810137 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.809944 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 24 21:17:47.810137 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.809961 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 24 21:17:47.810137 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.810070 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 24 21:17:47.817928 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.817906 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:17:47.918563 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.918493 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mmt7h"]
Apr 24 21:17:47.922133 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:17:47.922108 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ba14ffd_52bc_443a_827d_b237a10721e1.slice/crio-574893ff8ea6baea981792e57f7f9ad65181649b999e481234c3138e0e794201 WatchSource:0}: Error finding container 574893ff8ea6baea981792e57f7f9ad65181649b999e481234c3138e0e794201: Status 404 returned error can't find the container with id 574893ff8ea6baea981792e57f7f9ad65181649b999e481234c3138e0e794201
Apr 24 21:17:47.962442 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.962389 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:47.962442 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.962433 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwpsc\" (UniqueName: \"kubernetes.io/projected/99c697a7-950b-4e94-b800-4425d568df3f-kube-api-access-qwpsc\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:47.962585 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.962463 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/99c697a7-950b-4e94-b800-4425d568df3f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:47.962951 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.962902 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:47.963174 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.963119 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:47.963281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.963182 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99c697a7-950b-4e94-b800-4425d568df3f-config-out\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:47.963281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.963223 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99c697a7-950b-4e94-b800-4425d568df3f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:47.963281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.963261 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-web-config\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:47.963433 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.963297 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:47.963433 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.963326 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-config-volume\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:47.963433 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.963358 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:47.963563 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.963479 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99c697a7-950b-4e94-b800-4425d568df3f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:47.963563 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:47.963511 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c697a7-950b-4e94-b800-4425d568df3f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.064124 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.064093 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.064283 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.064136 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.064283 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.064175 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99c697a7-950b-4e94-b800-4425d568df3f-config-out\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.064283 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.064203 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99c697a7-950b-4e94-b800-4425d568df3f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.064438 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.064343 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-web-config\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.064438 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.064389 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.064438 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.064418 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-config-volume\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.064652 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.064447 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.064652 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.064502 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99c697a7-950b-4e94-b800-4425d568df3f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.064652 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.064534 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c697a7-950b-4e94-b800-4425d568df3f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.065245 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.064940 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99c697a7-950b-4e94-b800-4425d568df3f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.065699 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.065371 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c697a7-950b-4e94-b800-4425d568df3f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.067612 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.067184 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.067612 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.067335 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.067612 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.067342 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99c697a7-950b-4e94-b800-4425d568df3f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.067612 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.067385 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwpsc\" (UniqueName: \"kubernetes.io/projected/99c697a7-950b-4e94-b800-4425d568df3f-kube-api-access-qwpsc\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.067612 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.067417 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/99c697a7-950b-4e94-b800-4425d568df3f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.067928 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.067679 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.067928 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.067729 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-web-config\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.068034 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.067945 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/99c697a7-950b-4e94-b800-4425d568df3f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.068141 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.068113 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.068566 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.068542 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99c697a7-950b-4e94-b800-4425d568df3f-config-out\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.068655 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.068569 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-config-volume\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.069334 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.069313 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.069725 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.069704 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.076789 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.076771 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwpsc\" (UniqueName: \"kubernetes.io/projected/99c697a7-950b-4e94-b800-4425d568df3f-kube-api-access-qwpsc\") pod \"alertmanager-main-0\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.125602 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.125575 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:17:48.323574 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.323549 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:17:48.327033 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:17:48.327000 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99c697a7_950b_4e94_b800_4425d568df3f.slice/crio-c359f6115ba9d2c461ae37b9975ae02d15fec512e27874c0760bc4f0d70f4444 WatchSource:0}: Error finding container c359f6115ba9d2c461ae37b9975ae02d15fec512e27874c0760bc4f0d70f4444: Status 404 returned error can't find the container with id c359f6115ba9d2c461ae37b9975ae02d15fec512e27874c0760bc4f0d70f4444
Apr 24 21:17:48.673401 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.673357 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" event={"ID":"b1111e14-fede-42ce-8a23-a8d08526e8c6","Type":"ContainerStarted","Data":"d82dc446eceace46717ea42a6636b95d642ac92259c29b8d43cad56483f44ab4"}
Apr 24 21:17:48.676015 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.675986 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"99c697a7-950b-4e94-b800-4425d568df3f","Type":"ContainerStarted","Data":"c359f6115ba9d2c461ae37b9975ae02d15fec512e27874c0760bc4f0d70f4444"}
Apr 24 21:17:48.677818 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.677786 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z9gbt" event={"ID":"1352a5df-0a6a-4562-ae3d-1061283310eb","Type":"ContainerStarted","Data":"c0910cfe0ca8ee1104e8ebee64a9a2bd4ddceb5f21928313b96276fd65c918aa"}
Apr 24 21:17:48.678796 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.678773 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" event={"ID":"7ba14ffd-52bc-443a-827d-b237a10721e1","Type":"ContainerStarted","Data":"574893ff8ea6baea981792e57f7f9ad65181649b999e481234c3138e0e794201"}
Apr 24 21:17:48.698519 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.698475 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dzbsw" podStartSLOduration=1.802819011 podStartE2EDuration="2.698459411s" podCreationTimestamp="2026-04-24 21:17:46 +0000 UTC" firstStartedPulling="2026-04-24 21:17:47.364027655 +0000 UTC m=+55.498426254" lastFinishedPulling="2026-04-24 21:17:48.259668036 +0000 UTC m=+56.394066654" observedRunningTime="2026-04-24 21:17:48.697392623 +0000 UTC m=+56.831791254" watchObservedRunningTime="2026-04-24 21:17:48.698459411 +0000 UTC m=+56.832858034"
Apr 24 21:17:48.725839 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.725813 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg"]
Apr 24 21:17:48.731109 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.731091 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg"
Apr 24 21:17:48.735522 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.735132 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 24 21:17:48.735522 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.735381 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-nv2f5\""
Apr 24 21:17:48.735878 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.735858 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 24 21:17:48.735959 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.735890 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 24 21:17:48.736554 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.736433 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-a6kgiicavkghs\""
Apr 24 21:17:48.736554 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.736461 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 24 21:17:48.739660 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.739641 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 24 21:17:48.750284 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.750245 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg"]
Apr 24 21:17:48.772359 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.772334 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg"
Apr 24 21:17:48.772461 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.772393 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg"
Apr 24 21:17:48.772461 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.772424 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgnt7\" (UniqueName: \"kubernetes.io/projected/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-kube-api-access-sgnt7\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg"
Apr 24 21:17:48.772575 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.772461 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg"
Apr 24 21:17:48.772575 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.772532 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg"
Apr 24 21:17:48.772676 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.772575 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-grpc-tls\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg"
Apr 24 21:17:48.772676 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.772619 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-thanos-querier-tls\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg"
Apr 24 21:17:48.772676 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.772644 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-metrics-client-ca\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg"
Apr 24 21:17:48.873275 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.873241 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName:
\"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:48.873438 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.873296 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:48.873438 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.873328 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgnt7\" (UniqueName: \"kubernetes.io/projected/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-kube-api-access-sgnt7\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:48.873438 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.873370 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:48.873438 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.873420 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-thanos-querier-kube-rbac-proxy\") pod 
\"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:48.873640 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.873452 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-grpc-tls\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:48.873640 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.873495 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-thanos-querier-tls\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:48.873640 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.873519 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-metrics-client-ca\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:48.874523 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.874320 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-metrics-client-ca\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:48.876809 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.876783 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:48.877028 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.877003 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:48.877907 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.877856 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-thanos-querier-tls\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:48.878103 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.878077 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:48.878276 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.878258 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-grpc-tls\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:48.879208 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.879158 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:48.884470 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:48.884418 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgnt7\" (UniqueName: \"kubernetes.io/projected/dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa-kube-api-access-sgnt7\") pod \"thanos-querier-5b99dfb4fb-nzvxg\" (UID: \"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa\") " pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:49.041931 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:49.041905 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:49.686073 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:49.686032 2569 generic.go:358] "Generic (PLEG): container finished" podID="1352a5df-0a6a-4562-ae3d-1061283310eb" containerID="850d54672d1a9955ef8c9e9fdbe0c06de7ee6905ca05d8a1b6b4d7d0420fdbb7" exitCode=0 Apr 24 21:17:49.686497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:49.686117 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z9gbt" event={"ID":"1352a5df-0a6a-4562-ae3d-1061283310eb","Type":"ContainerDied","Data":"850d54672d1a9955ef8c9e9fdbe0c06de7ee6905ca05d8a1b6b4d7d0420fdbb7"} Apr 24 21:17:50.045334 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.045283 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg"] Apr 24 21:17:50.050289 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:17:50.050261 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfc47c98_7bc9_4cf3_83b3_d0ea02ed0bfa.slice/crio-a8d35c3ca7f4f35584ed682532b1ee351b891b053df63ce1459761b13856c328 WatchSource:0}: Error finding container a8d35c3ca7f4f35584ed682532b1ee351b891b053df63ce1459761b13856c328: Status 404 returned error can't find the container with id a8d35c3ca7f4f35584ed682532b1ee351b891b053df63ce1459761b13856c328 Apr 24 21:17:50.690913 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.690876 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" event={"ID":"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa","Type":"ContainerStarted","Data":"a8d35c3ca7f4f35584ed682532b1ee351b891b053df63ce1459761b13856c328"} Apr 24 21:17:50.692307 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.692282 2569 generic.go:358] "Generic (PLEG): container finished" podID="99c697a7-950b-4e94-b800-4425d568df3f" 
containerID="7cf70100a50031dc7869929184bff8cce72f5cb83fcca8fbc853308a711bb1f5" exitCode=0 Apr 24 21:17:50.692421 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.692357 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"99c697a7-950b-4e94-b800-4425d568df3f","Type":"ContainerDied","Data":"7cf70100a50031dc7869929184bff8cce72f5cb83fcca8fbc853308a711bb1f5"} Apr 24 21:17:50.694521 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.694480 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z9gbt" event={"ID":"1352a5df-0a6a-4562-ae3d-1061283310eb","Type":"ContainerStarted","Data":"71d7ad4cddc43e030f612193e49540b0afd94e65ead783d471c584ae5a180e74"} Apr 24 21:17:50.694521 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.694510 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z9gbt" event={"ID":"1352a5df-0a6a-4562-ae3d-1061283310eb","Type":"ContainerStarted","Data":"5becb2352ac96e5a2abdaa79927debba4686a63b98a60eaf157c75e62d548cc1"} Apr 24 21:17:50.696586 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.696562 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" event={"ID":"7ba14ffd-52bc-443a-827d-b237a10721e1","Type":"ContainerStarted","Data":"7779464c7b0baca4972f8241ecfdb5a2baa498ab7e8467f3112d34e2be5bf58f"} Apr 24 21:17:50.696702 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.696594 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" event={"ID":"7ba14ffd-52bc-443a-827d-b237a10721e1","Type":"ContainerStarted","Data":"2bfe523902c033b2a4b293671d125c25b2b8d881e9b2bccc48b2c7f612615b40"} Apr 24 21:17:50.696702 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.696608 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" 
event={"ID":"7ba14ffd-52bc-443a-827d-b237a10721e1","Type":"ContainerStarted","Data":"4c7e8ad19c50f65e92567d35ae76e3e026e91da192d8ef008d3bb0c43efc148f"} Apr 24 21:17:50.734686 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.734632 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-mmt7h" podStartSLOduration=2.736454015 podStartE2EDuration="4.734617192s" podCreationTimestamp="2026-04-24 21:17:46 +0000 UTC" firstStartedPulling="2026-04-24 21:17:47.924185038 +0000 UTC m=+56.058583636" lastFinishedPulling="2026-04-24 21:17:49.922348208 +0000 UTC m=+58.056746813" observedRunningTime="2026-04-24 21:17:50.733387171 +0000 UTC m=+58.867785803" watchObservedRunningTime="2026-04-24 21:17:50.734617192 +0000 UTC m=+58.869015815" Apr 24 21:17:50.763509 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.763461 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-z9gbt" podStartSLOduration=3.890500094 podStartE2EDuration="4.763448897s" podCreationTimestamp="2026-04-24 21:17:46 +0000 UTC" firstStartedPulling="2026-04-24 21:17:47.806505916 +0000 UTC m=+55.940904529" lastFinishedPulling="2026-04-24 21:17:48.679454733 +0000 UTC m=+56.813853332" observedRunningTime="2026-04-24 21:17:50.762221107 +0000 UTC m=+58.896619745" watchObservedRunningTime="2026-04-24 21:17:50.763448897 +0000 UTC m=+58.897847519" Apr 24 21:17:50.765479 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.765458 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b478d99cb-nn6bh"] Apr 24 21:17:50.768712 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.768693 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.771788 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.771662 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 21:17:50.771788 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.771678 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-sxvx6\"" Apr 24 21:17:50.771788 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.771743 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 21:17:50.771788 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.771772 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 21:17:50.771788 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.771695 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 21:17:50.772035 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.771862 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 21:17:50.772035 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.771968 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 21:17:50.772035 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.771988 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 21:17:50.779932 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.779873 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b478d99cb-nn6bh"] Apr 24 21:17:50.781723 ip-10-0-132-118 kubenswrapper[2569]: I0424 
21:17:50.781704 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 21:17:50.788185 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.787499 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/133b2c9a-3e65-4237-98e8-b010b89f5025-console-serving-cert\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.788185 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.787809 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/133b2c9a-3e65-4237-98e8-b010b89f5025-console-oauth-config\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.788185 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.788136 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-console-config\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.788185 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.788168 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw4p9\" (UniqueName: \"kubernetes.io/projected/133b2c9a-3e65-4237-98e8-b010b89f5025-kube-api-access-rw4p9\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.788417 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.788200 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-oauth-serving-cert\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.788417 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.788248 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-trusted-ca-bundle\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.788517 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.788469 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-service-ca\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.889993 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.889954 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/133b2c9a-3e65-4237-98e8-b010b89f5025-console-serving-cert\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.889993 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.889997 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/133b2c9a-3e65-4237-98e8-b010b89f5025-console-oauth-config\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " 
pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.890203 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.890038 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-console-config\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.890203 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.890061 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rw4p9\" (UniqueName: \"kubernetes.io/projected/133b2c9a-3e65-4237-98e8-b010b89f5025-kube-api-access-rw4p9\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.890203 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.890090 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-oauth-serving-cert\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.890487 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.890456 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-trusted-ca-bundle\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.890606 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.890527 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-service-ca\") pod 
\"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.890923 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.890875 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-console-config\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.891039 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.890950 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-oauth-serving-cert\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.891261 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.891238 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-service-ca\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.891446 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.891405 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-trusted-ca-bundle\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.893261 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.893241 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/133b2c9a-3e65-4237-98e8-b010b89f5025-console-oauth-config\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.893736 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.893717 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/133b2c9a-3e65-4237-98e8-b010b89f5025-console-serving-cert\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:50.898586 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:50.898563 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw4p9\" (UniqueName: \"kubernetes.io/projected/133b2c9a-3e65-4237-98e8-b010b89f5025-kube-api-access-rw4p9\") pod \"console-5b478d99cb-nn6bh\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:51.084585 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.084553 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:17:51.220224 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.220185 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b478d99cb-nn6bh"] Apr 24 21:17:51.328115 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.327966 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"] Apr 24 21:17:51.332603 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.332580 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.336182 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.335600 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 24 21:17:51.336182 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.335662 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 24 21:17:51.336182 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.335683 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 24 21:17:51.336182 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.335713 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-k2xgt\""
Apr 24 21:17:51.336182 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.335809 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-eq5ugg44iqqbt\""
Apr 24 21:17:51.336182 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.336021 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 24 21:17:51.343410 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.343387 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"]
Apr 24 21:17:51.370372 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.370348 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-2dmnr"]
Apr 24 21:17:51.373748 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.373728 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2dmnr"
Apr 24 21:17:51.376593 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.376464 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-nbcrt\""
Apr 24 21:17:51.376593 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.376470 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 24 21:17:51.389894 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.389871 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-2dmnr"]
Apr 24 21:17:51.394580 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.394527 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-client-ca-bundle\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.394580 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.394568 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.394679 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.394600 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-audit-log\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.394679 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.394630 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfl8q\" (UniqueName: \"kubernetes.io/projected/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-kube-api-access-wfl8q\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.394791 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.394702 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-secret-metrics-server-client-certs\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.394791 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.394749 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-metrics-server-audit-profiles\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.394869 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.394809 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ddb507d-944f-4cf5-8f38-319dacf0c8b1-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-2dmnr\" (UID: \"8ddb507d-944f-4cf5-8f38-319dacf0c8b1\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2dmnr"
Apr 24 21:17:51.394869 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.394838 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-secret-metrics-server-tls\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.495841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.495814 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-client-ca-bundle\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.496021 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.495853 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.496021 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.495884 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-audit-log\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.496021 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.495902 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfl8q\" (UniqueName: \"kubernetes.io/projected/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-kube-api-access-wfl8q\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.496021 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.495933 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-secret-metrics-server-client-certs\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.496021 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.495953 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-metrics-server-audit-profiles\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.496021 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.495988 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ddb507d-944f-4cf5-8f38-319dacf0c8b1-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-2dmnr\" (UID: \"8ddb507d-944f-4cf5-8f38-319dacf0c8b1\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2dmnr"
Apr 24 21:17:51.496021 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.496008 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-secret-metrics-server-tls\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.498123 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:51.496584 2569 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 24 21:17:51.498123 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:17:51.496653 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ddb507d-944f-4cf5-8f38-319dacf0c8b1-monitoring-plugin-cert podName:8ddb507d-944f-4cf5-8f38-319dacf0c8b1 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:51.996632241 +0000 UTC m=+60.131030840 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/8ddb507d-944f-4cf5-8f38-319dacf0c8b1-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-2dmnr" (UID: "8ddb507d-944f-4cf5-8f38-319dacf0c8b1") : secret "monitoring-plugin-cert" not found
Apr 24 21:17:51.498123 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.497066 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.498123 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.497637 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-metrics-server-audit-profiles\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.498123 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.497936 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-audit-log\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.499120 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.499068 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-client-ca-bundle\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.499337 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.499292 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-secret-metrics-server-tls\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.499533 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.499512 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-secret-metrics-server-client-certs\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.505182 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.505149 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfl8q\" (UniqueName: \"kubernetes.io/projected/1b4e5d80-ddd3-4111-9592-2ed3e29a3669-kube-api-access-wfl8q\") pod \"metrics-server-5c68b88bf5-lpq5h\" (UID: \"1b4e5d80-ddd3-4111-9592-2ed3e29a3669\") " pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.645036 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:51.644954 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"
Apr 24 21:17:51.715736 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:17:51.715706 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod133b2c9a_3e65_4237_98e8_b010b89f5025.slice/crio-91bfb4a18bbc64390af1db54a604cdd17be9e567d7754e0e79c3c270f45f3e50 WatchSource:0}: Error finding container 91bfb4a18bbc64390af1db54a604cdd17be9e567d7754e0e79c3c270f45f3e50: Status 404 returned error can't find the container with id 91bfb4a18bbc64390af1db54a604cdd17be9e567d7754e0e79c3c270f45f3e50
Apr 24 21:17:52.000469 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.000393 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ddb507d-944f-4cf5-8f38-319dacf0c8b1-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-2dmnr\" (UID: \"8ddb507d-944f-4cf5-8f38-319dacf0c8b1\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2dmnr"
Apr 24 21:17:52.003054 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.003019 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ddb507d-944f-4cf5-8f38-319dacf0c8b1-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-2dmnr\" (UID: \"8ddb507d-944f-4cf5-8f38-319dacf0c8b1\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2dmnr"
Apr 24 21:17:52.055011 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.054982 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"]
Apr 24 21:17:52.058025 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.058004 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.061231 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.061207 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 24 21:17:52.061324 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.061270 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 24 21:17:52.061415 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.061397 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-qtdqs\""
Apr 24 21:17:52.062111 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.062095 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 24 21:17:52.062205 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.062175 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 24 21:17:52.062416 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.062233 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 24 21:17:52.068550 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.068532 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 24 21:17:52.074633 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.074613 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"]
Apr 24 21:17:52.101867 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.101842 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-secret-telemeter-client\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.101964 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.101885 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-telemeter-client-tls\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.102019 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.101964 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-telemeter-trusted-ca-bundle\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.102019 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.102010 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-serving-certs-ca-bundle\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.102104 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.102034 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dft4k\" (UniqueName: \"kubernetes.io/projected/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-kube-api-access-dft4k\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.102104 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.102077 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-federate-client-tls\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.102185 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.102163 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.102225 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.102208 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-metrics-client-ca\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.203975 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.203363 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-secret-telemeter-client\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.203975 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.203411 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-telemeter-client-tls\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.203975 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.203474 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-telemeter-trusted-ca-bundle\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.203975 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.203513 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-serving-certs-ca-bundle\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.203975 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.203537 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dft4k\" (UniqueName: \"kubernetes.io/projected/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-kube-api-access-dft4k\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.203975 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.203573 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-federate-client-tls\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.203975 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.203623 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.203975 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.203657 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-metrics-client-ca\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.205900 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.205069 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-serving-certs-ca-bundle\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.205900 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.205840 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-telemeter-trusted-ca-bundle\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.206520 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.206481 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-metrics-client-ca\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.207503 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.207456 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.208457 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.208401 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-telemeter-client-tls\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.209791 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.209731 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-secret-telemeter-client\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.210004 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.209985 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-federate-client-tls\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.214263 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.214235 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dft4k\" (UniqueName: \"kubernetes.io/projected/e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b-kube-api-access-dft4k\") pod \"telemeter-client-764bbb7fd8-mfckr\" (UID: \"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b\") " pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.248660 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.248638 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5c68b88bf5-lpq5h"]
Apr 24 21:17:52.252457 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:17:52.252375 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b4e5d80_ddd3_4111_9592_2ed3e29a3669.slice/crio-850574572a8e5792d810730a2fda7a5cb8049071e771c68d0a7f996d7b39af20 WatchSource:0}: Error finding container 850574572a8e5792d810730a2fda7a5cb8049071e771c68d0a7f996d7b39af20: Status 404 returned error can't find the container with id 850574572a8e5792d810730a2fda7a5cb8049071e771c68d0a7f996d7b39af20
Apr 24 21:17:52.284913 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.284886 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2dmnr"
Apr 24 21:17:52.372527 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.372380 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-qtdqs\""
Apr 24 21:17:52.379609 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.379249 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"
Apr 24 21:17:52.453828 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.453114 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-2dmnr"]
Apr 24 21:17:52.553627 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.553601 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-764bbb7fd8-mfckr"]
Apr 24 21:17:52.556359 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:17:52.556318 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7db6a57_49dd_4d4c_9fdc_1b5a3c89670b.slice/crio-d380227be743386d064513296b111e811dd34bb460d65f2dba279c88b18ce552 WatchSource:0}: Error finding container d380227be743386d064513296b111e811dd34bb460d65f2dba279c88b18ce552: Status 404 returned error can't find the container with id d380227be743386d064513296b111e811dd34bb460d65f2dba279c88b18ce552
Apr 24 21:17:52.704333 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.704298 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr" event={"ID":"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b","Type":"ContainerStarted","Data":"d380227be743386d064513296b111e811dd34bb460d65f2dba279c88b18ce552"}
Apr 24 21:17:52.705463 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.705433 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h" event={"ID":"1b4e5d80-ddd3-4111-9592-2ed3e29a3669","Type":"ContainerStarted","Data":"850574572a8e5792d810730a2fda7a5cb8049071e771c68d0a7f996d7b39af20"}
Apr 24 21:17:52.706604 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.706564 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b478d99cb-nn6bh" event={"ID":"133b2c9a-3e65-4237-98e8-b010b89f5025","Type":"ContainerStarted","Data":"91bfb4a18bbc64390af1db54a604cdd17be9e567d7754e0e79c3c270f45f3e50"}
Apr 24 21:17:52.709117 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.709091 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" event={"ID":"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa","Type":"ContainerStarted","Data":"bd31335d2fd170732495a15c422437f16a9815970c02f274f9d7f614510d8cfe"}
Apr 24 21:17:52.709220 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.709125 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" event={"ID":"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa","Type":"ContainerStarted","Data":"cf6cac45117e0937d052d1f34c1b88697cdb64cebe3b467ab4e57a8a2f75388f"}
Apr 24 21:17:52.709220 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.709138 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" event={"ID":"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa","Type":"ContainerStarted","Data":"aee563956e769bea53b33d3b0d3e9af984014b761a38f4a92ef92a6e0b829e12"}
Apr 24 21:17:52.712894 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.712871 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"99c697a7-950b-4e94-b800-4425d568df3f","Type":"ContainerStarted","Data":"041827f4739185933a16d112e77a1f82922037ec60504f36e597c3b1936b71da"}
Apr 24 21:17:52.712894 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.712897 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"99c697a7-950b-4e94-b800-4425d568df3f","Type":"ContainerStarted","Data":"edde5f4acbfe347b532c1233121b06bc1a41fd8191ffc61afc77cc0c832abb7a"}
Apr 24 21:17:52.713024 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.712909 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"99c697a7-950b-4e94-b800-4425d568df3f","Type":"ContainerStarted","Data":"c53d566c4d4998878f69e54f6090ad97fd8fd31e0d674bd83db0378d7ef129a4"}
Apr 24 21:17:52.713024 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.712922 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"99c697a7-950b-4e94-b800-4425d568df3f","Type":"ContainerStarted","Data":"94ae04ebb07d3e12122cd6bdc910251bce862073d45bdb0ce68371f36cd9e430"}
Apr 24 21:17:52.713024 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.712935 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"99c697a7-950b-4e94-b800-4425d568df3f","Type":"ContainerStarted","Data":"f7fd57e9c203a24b9ef8905325401ae9da103f67c23a37a75f16ea9ccc9e7c3f"}
Apr 24 21:17:52.714255 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:52.714229 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2dmnr" event={"ID":"8ddb507d-944f-4cf5-8f38-319dacf0c8b1","Type":"ContainerStarted","Data":"017a50893e5a9358123861421c990da6ee16848bdf9ad9097cf579f5dd21d51c"}
Apr 24 21:17:53.102709 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.102649 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:17:53.108777 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.108386 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:17:53.125547 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.123522 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 24 21:17:53.125547 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.123920 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 24 21:17:53.126493 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.125810 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 24 21:17:53.126493 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.126010 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 24 21:17:53.126493 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.126068 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-h2dg2\""
Apr 24 21:17:53.126493 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.126191 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 24 21:17:53.126493 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.126344 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 24 21:17:53.126493 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.126359 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 24 21:17:53.127199 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.126995 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 24 21:17:53.127356 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.127321 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-123athu0f2uqi\""
Apr 24 21:17:53.128887 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.128696 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 24 21:17:53.131145 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.131126 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 24 21:17:53.135078 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.135051 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 24 21:17:53.143386 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.143362 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:17:53.147632 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.147428 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 24 21:17:53.212814 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.212636 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:17:53.212814 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.212681 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:17:53.212814 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.212712 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-web-config\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:17:53.212814 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.212779 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:17:53.213601 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.213502 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:17:53.214018 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.213895 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/729a228e-b1f6-421c-aefd-99947f19c1fc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.214018 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.213932 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.214779 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.214440 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.214779 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.214496 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.214779 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.214523 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.214779 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.214555 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/729a228e-b1f6-421c-aefd-99947f19c1fc-config-out\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.214779 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.214580 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.214779 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.214639 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.214779 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.214667 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.214779 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.214701 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-config\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.214779 
ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.214729 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snvjl\" (UniqueName: \"kubernetes.io/projected/729a228e-b1f6-421c-aefd-99947f19c1fc-kube-api-access-snvjl\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.214779 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.214782 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.215798 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.214806 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.251101 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.251077 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fsxch" Apr 24 21:17:53.315847 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.315738 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snvjl\" (UniqueName: \"kubernetes.io/projected/729a228e-b1f6-421c-aefd-99947f19c1fc-kube-api-access-snvjl\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.315847 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.315804 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.315847 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.315833 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.316749 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.316724 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.316862 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.316789 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.316862 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.316820 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-web-config\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.316976 ip-10-0-132-118 
kubenswrapper[2569]: I0424 21:17:53.316879 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.316976 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.316913 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.316976 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.316951 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/729a228e-b1f6-421c-aefd-99947f19c1fc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.317126 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.316980 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.317126 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.317011 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.317126 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.317076 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.317126 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.317101 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.317332 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.317133 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.317332 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.317168 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/729a228e-b1f6-421c-aefd-99947f19c1fc-config-out\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.317332 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.317194 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod 
\"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.317332 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.317297 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.317529 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.317332 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.317529 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.317380 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-config\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.317886 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.317863 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.319095 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.319071 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.321195 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.321170 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.323420 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.323396 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.325410 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.325110 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.327078 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.326513 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-config\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.327878 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.327693 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.330595 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.328417 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.330595 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.328868 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.330595 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.329298 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.330595 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.330100 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/729a228e-b1f6-421c-aefd-99947f19c1fc-config-out\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.330595 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.330461 2569 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/729a228e-b1f6-421c-aefd-99947f19c1fc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.331490 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.331446 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.331964 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.331924 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-web-config\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.333160 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.332675 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snvjl\" (UniqueName: \"kubernetes.io/projected/729a228e-b1f6-421c-aefd-99947f19c1fc-kube-api-access-snvjl\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.333160 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.333121 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.333779 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.333741 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.432117 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.431801 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:17:53.635491 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.635394 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:17:53.722390 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.722295 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" event={"ID":"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa","Type":"ContainerStarted","Data":"055f62916ff5b2c817ee9cf9a5b78622c6dd8580e14f3c5250b807daa62ab931"} Apr 24 21:17:53.722390 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.722364 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" event={"ID":"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa","Type":"ContainerStarted","Data":"a779ea7f3343d73e11ffc53f615f99b019f0f76ba10b165574624109f29ce272"} Apr 24 21:17:53.726684 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.726351 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"99c697a7-950b-4e94-b800-4425d568df3f","Type":"ContainerStarted","Data":"d24219fee94e32ba7c08f7958022a5deaa6f06e56342a4dd3e24bf43dfc59301"} Apr 24 21:17:53.756660 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:53.756608 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.818032364 
podStartE2EDuration="6.756591391s" podCreationTimestamp="2026-04-24 21:17:47 +0000 UTC" firstStartedPulling="2026-04-24 21:17:48.329318078 +0000 UTC m=+56.463716692" lastFinishedPulling="2026-04-24 21:17:53.267877105 +0000 UTC m=+61.402275719" observedRunningTime="2026-04-24 21:17:53.755648277 +0000 UTC m=+61.890046902" watchObservedRunningTime="2026-04-24 21:17:53.756591391 +0000 UTC m=+61.890990015" Apr 24 21:17:54.039228 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:17:54.039204 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod729a228e_b1f6_421c_aefd_99947f19c1fc.slice/crio-e1a29be36495591d58b6583ad7c374b16fae31185673d620b46ada0ed42d9284 WatchSource:0}: Error finding container e1a29be36495591d58b6583ad7c374b16fae31185673d620b46ada0ed42d9284: Status 404 returned error can't find the container with id e1a29be36495591d58b6583ad7c374b16fae31185673d620b46ada0ed42d9284 Apr 24 21:17:54.731085 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:54.730999 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"729a228e-b1f6-421c-aefd-99947f19c1fc","Type":"ContainerStarted","Data":"e1a29be36495591d58b6583ad7c374b16fae31185673d620b46ada0ed42d9284"} Apr 24 21:17:56.178508 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.178466 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b478d99cb-nn6bh"] Apr 24 21:17:56.669781 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.669736 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cdfn8" Apr 24 21:17:56.738376 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.738341 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2dmnr" 
event={"ID":"8ddb507d-944f-4cf5-8f38-319dacf0c8b1","Type":"ContainerStarted","Data":"0f94b9f20f12150c6434e797ed3c1400f715e03da2adab09bbfd359d9daa6fa9"} Apr 24 21:17:56.738563 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.738521 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2dmnr" Apr 24 21:17:56.739950 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.739924 2569 generic.go:358] "Generic (PLEG): container finished" podID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerID="cce2a5d16c58c100ac32248036bae2de098e84ade8ecc35175cafca12d52c2f4" exitCode=0 Apr 24 21:17:56.740053 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.739951 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"729a228e-b1f6-421c-aefd-99947f19c1fc","Type":"ContainerDied","Data":"cce2a5d16c58c100ac32248036bae2de098e84ade8ecc35175cafca12d52c2f4"} Apr 24 21:17:56.741913 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.741875 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr" event={"ID":"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b","Type":"ContainerStarted","Data":"a5c6f41857cee64c5df12091507e747aaccbae4c7f1d92fc9ec3861463dbc5b5"} Apr 24 21:17:56.741995 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.741928 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr" event={"ID":"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b","Type":"ContainerStarted","Data":"94dccbada01fecd8d2e82984fac293e3eda4cab53c2638720dd20196c9a379a8"} Apr 24 21:17:56.741995 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.741943 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr" 
event={"ID":"e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b","Type":"ContainerStarted","Data":"1b046fd88b524663d4588d55ea5d29ed1cd1c50d1e26fb5b5964d47afb96217e"} Apr 24 21:17:56.743365 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.743345 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h" event={"ID":"1b4e5d80-ddd3-4111-9592-2ed3e29a3669","Type":"ContainerStarted","Data":"3debbe1ebaa1f8fd084d40c0fa07a247319d7b9b27c6231cf94bfdf52be9d98c"} Apr 24 21:17:56.743821 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.743801 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2dmnr" Apr 24 21:17:56.744738 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.744718 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b478d99cb-nn6bh" event={"ID":"133b2c9a-3e65-4237-98e8-b010b89f5025","Type":"ContainerStarted","Data":"3fb4df13674e8db18d041c8808ea610fd6b57c92ce8916ab98b4d70b6914a256"} Apr 24 21:17:56.747688 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.747670 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" event={"ID":"dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa","Type":"ContainerStarted","Data":"5445c7418e62b06c4f7000f51d7276edbb34b2f822ea430a6cf1bb1ce44df9a4"} Apr 24 21:17:56.747806 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.747795 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:56.759487 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.759448 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2dmnr" podStartSLOduration=2.302550267 podStartE2EDuration="5.759436722s" podCreationTimestamp="2026-04-24 21:17:51 +0000 UTC" firstStartedPulling="2026-04-24 
21:17:52.454894117 +0000 UTC m=+60.589292718" lastFinishedPulling="2026-04-24 21:17:55.911780561 +0000 UTC m=+64.046179173" observedRunningTime="2026-04-24 21:17:56.758015497 +0000 UTC m=+64.892414122" watchObservedRunningTime="2026-04-24 21:17:56.759436722 +0000 UTC m=+64.893835343" Apr 24 21:17:56.779165 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.779130 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h" podStartSLOduration=2.122650141 podStartE2EDuration="5.779120975s" podCreationTimestamp="2026-04-24 21:17:51 +0000 UTC" firstStartedPulling="2026-04-24 21:17:52.254847602 +0000 UTC m=+60.389246218" lastFinishedPulling="2026-04-24 21:17:55.91131845 +0000 UTC m=+64.045717052" observedRunningTime="2026-04-24 21:17:56.77887814 +0000 UTC m=+64.913276784" watchObservedRunningTime="2026-04-24 21:17:56.779120975 +0000 UTC m=+64.913519597" Apr 24 21:17:56.863340 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.863289 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" podStartSLOduration=5.650995785 podStartE2EDuration="8.863275215s" podCreationTimestamp="2026-04-24 21:17:48 +0000 UTC" firstStartedPulling="2026-04-24 21:17:50.052336342 +0000 UTC m=+58.186734941" lastFinishedPulling="2026-04-24 21:17:53.264615764 +0000 UTC m=+61.399014371" observedRunningTime="2026-04-24 21:17:56.862344489 +0000 UTC m=+64.996743109" watchObservedRunningTime="2026-04-24 21:17:56.863275215 +0000 UTC m=+64.997673836" Apr 24 21:17:56.941378 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:56.941296 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-764bbb7fd8-mfckr" podStartSLOduration=1.5887348239999999 podStartE2EDuration="4.9412822s" podCreationTimestamp="2026-04-24 21:17:52 +0000 UTC" firstStartedPulling="2026-04-24 21:17:52.558378583 +0000 UTC 
m=+60.692777182" lastFinishedPulling="2026-04-24 21:17:55.910925955 +0000 UTC m=+64.045324558" observedRunningTime="2026-04-24 21:17:56.94009756 +0000 UTC m=+65.074496184" watchObservedRunningTime="2026-04-24 21:17:56.9412822 +0000 UTC m=+65.075680821" Apr 24 21:17:57.760060 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:57.759886 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5b99dfb4fb-nzvxg" Apr 24 21:17:57.785100 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:57.785051 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b478d99cb-nn6bh" podStartSLOduration=3.591417849 podStartE2EDuration="7.785033284s" podCreationTimestamp="2026-04-24 21:17:50 +0000 UTC" firstStartedPulling="2026-04-24 21:17:51.717647 +0000 UTC m=+59.852045599" lastFinishedPulling="2026-04-24 21:17:55.911262434 +0000 UTC m=+64.045661034" observedRunningTime="2026-04-24 21:17:56.97262485 +0000 UTC m=+65.107023495" watchObservedRunningTime="2026-04-24 21:17:57.785033284 +0000 UTC m=+65.919431906" Apr 24 21:17:58.168107 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:58.168064 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs\") pod \"network-metrics-daemon-zqp7l\" (UID: \"ddd581ca-fe5d-4e33-965d-ad198f8af209\") " pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:58.170783 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:58.170750 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:17:58.181577 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:58.181543 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd581ca-fe5d-4e33-965d-ad198f8af209-metrics-certs\") pod 
\"network-metrics-daemon-zqp7l\" (UID: \"ddd581ca-fe5d-4e33-965d-ad198f8af209\") " pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:58.255489 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:58.255465 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gjqgx\"" Apr 24 21:17:58.261788 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:58.261748 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zqp7l" Apr 24 21:17:58.268896 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:58.268871 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5srt\" (UniqueName: \"kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt\") pod \"network-check-target-m9nk2\" (UID: \"ac26e9c0-3977-40b9-a44e-d694b6663276\") " pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:58.271670 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:58.271652 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:17:58.282532 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:58.282511 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:17:58.293559 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:58.293511 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5srt\" (UniqueName: \"kubernetes.io/projected/ac26e9c0-3977-40b9-a44e-d694b6663276-kube-api-access-k5srt\") pod \"network-check-target-m9nk2\" (UID: \"ac26e9c0-3977-40b9-a44e-d694b6663276\") " pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:58.399056 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:58.399030 2569 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zqp7l"] Apr 24 21:17:58.401418 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:17:58.401376 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddd581ca_fe5d_4e33_965d_ad198f8af209.slice/crio-53935728a961f7e4a8ee0c0c73a4c56f1e5f43981ce43a908f0229834de4892a WatchSource:0}: Error finding container 53935728a961f7e4a8ee0c0c73a4c56f1e5f43981ce43a908f0229834de4892a: Status 404 returned error can't find the container with id 53935728a961f7e4a8ee0c0c73a4c56f1e5f43981ce43a908f0229834de4892a Apr 24 21:17:58.548442 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:58.548414 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6pwcs\"" Apr 24 21:17:58.555809 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:58.555790 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:17:58.755741 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:58.755696 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zqp7l" event={"ID":"ddd581ca-fe5d-4e33-965d-ad198f8af209","Type":"ContainerStarted","Data":"53935728a961f7e4a8ee0c0c73a4c56f1e5f43981ce43a908f0229834de4892a"} Apr 24 21:17:59.631680 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:59.631649 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m9nk2"] Apr 24 21:17:59.635325 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:17:59.635290 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac26e9c0_3977_40b9_a44e_d694b6663276.slice/crio-d89782081fed7bbbd7e76e5fea737b79426e52be620efaf5270d3d9346386664 WatchSource:0}: Error finding container 
d89782081fed7bbbd7e76e5fea737b79426e52be620efaf5270d3d9346386664: Status 404 returned error can't find the container with id d89782081fed7bbbd7e76e5fea737b79426e52be620efaf5270d3d9346386664 Apr 24 21:17:59.762082 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:59.762019 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"729a228e-b1f6-421c-aefd-99947f19c1fc","Type":"ContainerStarted","Data":"d80f6b97d6b023a57e9e4c8cb8f4cc0c1b5ef13fb49cf698a0df86d1550c40f7"} Apr 24 21:17:59.762082 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:59.762056 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"729a228e-b1f6-421c-aefd-99947f19c1fc","Type":"ContainerStarted","Data":"8919244014d0b25bc74dc438777044069de98f92d206c63efe295f48659703e3"} Apr 24 21:17:59.763171 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:17:59.763147 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m9nk2" event={"ID":"ac26e9c0-3977-40b9-a44e-d694b6663276","Type":"ContainerStarted","Data":"d89782081fed7bbbd7e76e5fea737b79426e52be620efaf5270d3d9346386664"} Apr 24 21:18:00.768545 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:00.768500 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zqp7l" event={"ID":"ddd581ca-fe5d-4e33-965d-ad198f8af209","Type":"ContainerStarted","Data":"792ec66ed98330d166020c8c58a9add98bee74bf6f5a3211279f6224b96f687c"} Apr 24 21:18:00.768545 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:00.768541 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zqp7l" event={"ID":"ddd581ca-fe5d-4e33-965d-ad198f8af209","Type":"ContainerStarted","Data":"46150247e32423fb2e85cf181238586b00789d104ac7d1681768a94a78ad650e"} Apr 24 21:18:00.772080 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:00.772052 2569 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"729a228e-b1f6-421c-aefd-99947f19c1fc","Type":"ContainerStarted","Data":"856d5c78599a916446b3c0e57aaece5dce88dcf0ed3b9c57e964c514c7f8a79b"} Apr 24 21:18:00.772189 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:00.772086 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"729a228e-b1f6-421c-aefd-99947f19c1fc","Type":"ContainerStarted","Data":"e9c378a7e419b6d57fbaf018451b7a888858089d2112053cca98021b9c28d1e3"} Apr 24 21:18:00.772189 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:00.772100 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"729a228e-b1f6-421c-aefd-99947f19c1fc","Type":"ContainerStarted","Data":"ae10121a8a1acbc0574b2c802f9bde68e0c50ce6a79f54a960ec8688a7f985ff"} Apr 24 21:18:00.772189 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:00.772110 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"729a228e-b1f6-421c-aefd-99947f19c1fc","Type":"ContainerStarted","Data":"1ceb848336074db6369ae11f058829490a6dcf4f0d64b84749e9aaac85b5c666"} Apr 24 21:18:00.783979 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:00.783935 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zqp7l" podStartSLOduration=66.864544221 podStartE2EDuration="1m8.783920974s" podCreationTimestamp="2026-04-24 21:16:52 +0000 UTC" firstStartedPulling="2026-04-24 21:17:58.403684475 +0000 UTC m=+66.538083074" lastFinishedPulling="2026-04-24 21:18:00.323061224 +0000 UTC m=+68.457459827" observedRunningTime="2026-04-24 21:18:00.782602439 +0000 UTC m=+68.917001062" watchObservedRunningTime="2026-04-24 21:18:00.783920974 +0000 UTC m=+68.918319593" Apr 24 21:18:00.813038 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:00.812983 2569 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.993560253 podStartE2EDuration="7.81296882s" podCreationTimestamp="2026-04-24 21:17:53 +0000 UTC" firstStartedPulling="2026-04-24 21:17:56.741362811 +0000 UTC m=+64.875761413" lastFinishedPulling="2026-04-24 21:17:59.560771365 +0000 UTC m=+67.695169980" observedRunningTime="2026-04-24 21:18:00.810859117 +0000 UTC m=+68.945257751" watchObservedRunningTime="2026-04-24 21:18:00.81296882 +0000 UTC m=+68.947367440" Apr 24 21:18:01.084871 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:01.084839 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:18:02.780534 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:02.780502 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m9nk2" event={"ID":"ac26e9c0-3977-40b9-a44e-d694b6663276","Type":"ContainerStarted","Data":"051d2eb8328fdad6166aa1ab9e7d446899181001d71c497a13466d7f9902262f"} Apr 24 21:18:02.780887 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:02.780589 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:18:02.798146 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:02.798099 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-m9nk2" podStartSLOduration=68.157189221 podStartE2EDuration="1m10.798084508s" podCreationTimestamp="2026-04-24 21:16:52 +0000 UTC" firstStartedPulling="2026-04-24 21:17:59.637276123 +0000 UTC m=+67.771674727" lastFinishedPulling="2026-04-24 21:18:02.278171412 +0000 UTC m=+70.412570014" observedRunningTime="2026-04-24 21:18:02.797332284 +0000 UTC m=+70.931730916" watchObservedRunningTime="2026-04-24 21:18:02.798084508 +0000 UTC m=+70.932483129" Apr 24 21:18:03.432722 ip-10-0-132-118 
kubenswrapper[2569]: I0424 21:18:03.432682 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:11.645805 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:11.645662 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h" Apr 24 21:18:11.645805 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:11.645705 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h" Apr 24 21:18:22.774339 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:22.774298 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5b478d99cb-nn6bh" podUID="133b2c9a-3e65-4237-98e8-b010b89f5025" containerName="console" containerID="cri-o://3fb4df13674e8db18d041c8808ea610fd6b57c92ce8916ab98b4d70b6914a256" gracePeriod=15 Apr 24 21:18:23.030132 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.030109 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b478d99cb-nn6bh_133b2c9a-3e65-4237-98e8-b010b89f5025/console/0.log" Apr 24 21:18:23.030280 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.030178 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:18:23.175585 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.175560 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/133b2c9a-3e65-4237-98e8-b010b89f5025-console-serving-cert\") pod \"133b2c9a-3e65-4237-98e8-b010b89f5025\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " Apr 24 21:18:23.175722 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.175594 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw4p9\" (UniqueName: \"kubernetes.io/projected/133b2c9a-3e65-4237-98e8-b010b89f5025-kube-api-access-rw4p9\") pod \"133b2c9a-3e65-4237-98e8-b010b89f5025\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " Apr 24 21:18:23.175722 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.175627 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-trusted-ca-bundle\") pod \"133b2c9a-3e65-4237-98e8-b010b89f5025\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " Apr 24 21:18:23.175827 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.175747 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-oauth-serving-cert\") pod \"133b2c9a-3e65-4237-98e8-b010b89f5025\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " Apr 24 21:18:23.175827 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.175806 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-console-config\") pod \"133b2c9a-3e65-4237-98e8-b010b89f5025\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " Apr 24 21:18:23.175937 
ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.175840 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/133b2c9a-3e65-4237-98e8-b010b89f5025-console-oauth-config\") pod \"133b2c9a-3e65-4237-98e8-b010b89f5025\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " Apr 24 21:18:23.175937 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.175888 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-service-ca\") pod \"133b2c9a-3e65-4237-98e8-b010b89f5025\" (UID: \"133b2c9a-3e65-4237-98e8-b010b89f5025\") " Apr 24 21:18:23.176156 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.176040 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "133b2c9a-3e65-4237-98e8-b010b89f5025" (UID: "133b2c9a-3e65-4237-98e8-b010b89f5025"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:23.176273 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.176151 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "133b2c9a-3e65-4237-98e8-b010b89f5025" (UID: "133b2c9a-3e65-4237-98e8-b010b89f5025"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:23.176343 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.176281 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-console-config" (OuterVolumeSpecName: "console-config") pod "133b2c9a-3e65-4237-98e8-b010b89f5025" (UID: "133b2c9a-3e65-4237-98e8-b010b89f5025"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:23.176404 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.176384 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-service-ca" (OuterVolumeSpecName: "service-ca") pod "133b2c9a-3e65-4237-98e8-b010b89f5025" (UID: "133b2c9a-3e65-4237-98e8-b010b89f5025"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:23.177815 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.177791 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/133b2c9a-3e65-4237-98e8-b010b89f5025-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "133b2c9a-3e65-4237-98e8-b010b89f5025" (UID: "133b2c9a-3e65-4237-98e8-b010b89f5025"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:18:23.177904 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.177854 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/133b2c9a-3e65-4237-98e8-b010b89f5025-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "133b2c9a-3e65-4237-98e8-b010b89f5025" (UID: "133b2c9a-3e65-4237-98e8-b010b89f5025"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:18:23.177904 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.177881 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/133b2c9a-3e65-4237-98e8-b010b89f5025-kube-api-access-rw4p9" (OuterVolumeSpecName: "kube-api-access-rw4p9") pod "133b2c9a-3e65-4237-98e8-b010b89f5025" (UID: "133b2c9a-3e65-4237-98e8-b010b89f5025"). InnerVolumeSpecName "kube-api-access-rw4p9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:23.276924 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.276899 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-service-ca\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:18:23.276924 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.276920 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/133b2c9a-3e65-4237-98e8-b010b89f5025-console-serving-cert\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:18:23.277074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.276931 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rw4p9\" (UniqueName: \"kubernetes.io/projected/133b2c9a-3e65-4237-98e8-b010b89f5025-kube-api-access-rw4p9\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:18:23.277074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.276941 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-trusted-ca-bundle\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:18:23.277074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.276949 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-oauth-serving-cert\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:18:23.277074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.276958 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/133b2c9a-3e65-4237-98e8-b010b89f5025-console-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:18:23.277074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.276966 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/133b2c9a-3e65-4237-98e8-b010b89f5025-console-oauth-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:18:23.840852 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.840803 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b478d99cb-nn6bh_133b2c9a-3e65-4237-98e8-b010b89f5025/console/0.log" Apr 24 21:18:23.840852 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.840846 2569 generic.go:358] "Generic (PLEG): container finished" podID="133b2c9a-3e65-4237-98e8-b010b89f5025" containerID="3fb4df13674e8db18d041c8808ea610fd6b57c92ce8916ab98b4d70b6914a256" exitCode=2 Apr 24 21:18:23.841292 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.840931 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b478d99cb-nn6bh" event={"ID":"133b2c9a-3e65-4237-98e8-b010b89f5025","Type":"ContainerDied","Data":"3fb4df13674e8db18d041c8808ea610fd6b57c92ce8916ab98b4d70b6914a256"} Apr 24 21:18:23.841292 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.840950 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b478d99cb-nn6bh" Apr 24 21:18:23.841292 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.840961 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b478d99cb-nn6bh" event={"ID":"133b2c9a-3e65-4237-98e8-b010b89f5025","Type":"ContainerDied","Data":"91bfb4a18bbc64390af1db54a604cdd17be9e567d7754e0e79c3c270f45f3e50"} Apr 24 21:18:23.841292 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.840978 2569 scope.go:117] "RemoveContainer" containerID="3fb4df13674e8db18d041c8808ea610fd6b57c92ce8916ab98b4d70b6914a256" Apr 24 21:18:23.861936 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.861918 2569 scope.go:117] "RemoveContainer" containerID="3fb4df13674e8db18d041c8808ea610fd6b57c92ce8916ab98b4d70b6914a256" Apr 24 21:18:23.862263 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:18:23.862234 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fb4df13674e8db18d041c8808ea610fd6b57c92ce8916ab98b4d70b6914a256\": container with ID starting with 3fb4df13674e8db18d041c8808ea610fd6b57c92ce8916ab98b4d70b6914a256 not found: ID does not exist" containerID="3fb4df13674e8db18d041c8808ea610fd6b57c92ce8916ab98b4d70b6914a256" Apr 24 21:18:23.862333 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.862275 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fb4df13674e8db18d041c8808ea610fd6b57c92ce8916ab98b4d70b6914a256"} err="failed to get container status \"3fb4df13674e8db18d041c8808ea610fd6b57c92ce8916ab98b4d70b6914a256\": rpc error: code = NotFound desc = could not find container \"3fb4df13674e8db18d041c8808ea610fd6b57c92ce8916ab98b4d70b6914a256\": container with ID starting with 3fb4df13674e8db18d041c8808ea610fd6b57c92ce8916ab98b4d70b6914a256 not found: ID does not exist" Apr 24 21:18:23.863067 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.863049 2569 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b478d99cb-nn6bh"] Apr 24 21:18:23.866299 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:23.866279 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5b478d99cb-nn6bh"] Apr 24 21:18:24.436436 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:24.436404 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="133b2c9a-3e65-4237-98e8-b010b89f5025" path="/var/lib/kubelet/pods/133b2c9a-3e65-4237-98e8-b010b89f5025/volumes" Apr 24 21:18:31.650196 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:31.650169 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h" Apr 24 21:18:31.654085 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:31.654063 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5c68b88bf5-lpq5h" Apr 24 21:18:33.786414 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:33.786385 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-m9nk2" Apr 24 21:18:53.432815 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:53.432783 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:53.451751 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:53.451728 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:53.937533 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:18:53.937500 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:07.206411 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:07.206377 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 
21:19:07.206912 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:07.206860 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="alertmanager" containerID="cri-o://f7fd57e9c203a24b9ef8905325401ae9da103f67c23a37a75f16ea9ccc9e7c3f" gracePeriod=120 Apr 24 21:19:07.206991 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:07.206923 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="kube-rbac-proxy-metric" containerID="cri-o://041827f4739185933a16d112e77a1f82922037ec60504f36e597c3b1936b71da" gracePeriod=120 Apr 24 21:19:07.207046 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:07.206985 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="prom-label-proxy" containerID="cri-o://d24219fee94e32ba7c08f7958022a5deaa6f06e56342a4dd3e24bf43dfc59301" gracePeriod=120 Apr 24 21:19:07.207098 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:07.207041 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="config-reloader" containerID="cri-o://94ae04ebb07d3e12122cd6bdc910251bce862073d45bdb0ce68371f36cd9e430" gracePeriod=120 Apr 24 21:19:07.207098 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:07.206935 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="kube-rbac-proxy-web" containerID="cri-o://c53d566c4d4998878f69e54f6090ad97fd8fd31e0d674bd83db0378d7ef129a4" gracePeriod=120 Apr 24 21:19:07.207202 ip-10-0-132-118 kubenswrapper[2569]: I0424 
21:19:07.206964 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="kube-rbac-proxy" containerID="cri-o://edde5f4acbfe347b532c1233121b06bc1a41fd8191ffc61afc77cc0c832abb7a" gracePeriod=120 Apr 24 21:19:07.970879 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:07.970841 2569 generic.go:358] "Generic (PLEG): container finished" podID="99c697a7-950b-4e94-b800-4425d568df3f" containerID="d24219fee94e32ba7c08f7958022a5deaa6f06e56342a4dd3e24bf43dfc59301" exitCode=0 Apr 24 21:19:07.970879 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:07.970865 2569 generic.go:358] "Generic (PLEG): container finished" podID="99c697a7-950b-4e94-b800-4425d568df3f" containerID="041827f4739185933a16d112e77a1f82922037ec60504f36e597c3b1936b71da" exitCode=0 Apr 24 21:19:07.970879 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:07.970870 2569 generic.go:358] "Generic (PLEG): container finished" podID="99c697a7-950b-4e94-b800-4425d568df3f" containerID="edde5f4acbfe347b532c1233121b06bc1a41fd8191ffc61afc77cc0c832abb7a" exitCode=0 Apr 24 21:19:07.970879 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:07.970876 2569 generic.go:358] "Generic (PLEG): container finished" podID="99c697a7-950b-4e94-b800-4425d568df3f" containerID="94ae04ebb07d3e12122cd6bdc910251bce862073d45bdb0ce68371f36cd9e430" exitCode=0 Apr 24 21:19:07.970879 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:07.970881 2569 generic.go:358] "Generic (PLEG): container finished" podID="99c697a7-950b-4e94-b800-4425d568df3f" containerID="f7fd57e9c203a24b9ef8905325401ae9da103f67c23a37a75f16ea9ccc9e7c3f" exitCode=0 Apr 24 21:19:07.971153 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:07.970920 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"99c697a7-950b-4e94-b800-4425d568df3f","Type":"ContainerDied","Data":"d24219fee94e32ba7c08f7958022a5deaa6f06e56342a4dd3e24bf43dfc59301"} Apr 24 21:19:07.971153 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:07.970957 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"99c697a7-950b-4e94-b800-4425d568df3f","Type":"ContainerDied","Data":"041827f4739185933a16d112e77a1f82922037ec60504f36e597c3b1936b71da"} Apr 24 21:19:07.971153 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:07.970967 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"99c697a7-950b-4e94-b800-4425d568df3f","Type":"ContainerDied","Data":"edde5f4acbfe347b532c1233121b06bc1a41fd8191ffc61afc77cc0c832abb7a"} Apr 24 21:19:07.971153 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:07.970976 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"99c697a7-950b-4e94-b800-4425d568df3f","Type":"ContainerDied","Data":"94ae04ebb07d3e12122cd6bdc910251bce862073d45bdb0ce68371f36cd9e430"} Apr 24 21:19:07.971153 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:07.970985 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"99c697a7-950b-4e94-b800-4425d568df3f","Type":"ContainerDied","Data":"f7fd57e9c203a24b9ef8905325401ae9da103f67c23a37a75f16ea9ccc9e7c3f"} Apr 24 21:19:08.454802 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.454781 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:08.529241 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.529206 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c697a7-950b-4e94-b800-4425d568df3f-alertmanager-trusted-ca-bundle\") pod \"99c697a7-950b-4e94-b800-4425d568df3f\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " Apr 24 21:19:08.529400 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.529249 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/99c697a7-950b-4e94-b800-4425d568df3f-alertmanager-main-db\") pod \"99c697a7-950b-4e94-b800-4425d568df3f\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " Apr 24 21:19:08.529400 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.529276 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-config-volume\") pod \"99c697a7-950b-4e94-b800-4425d568df3f\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " Apr 24 21:19:08.529400 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.529291 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-main-tls\") pod \"99c697a7-950b-4e94-b800-4425d568df3f\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " Apr 24 21:19:08.529400 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.529336 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy-web\") pod \"99c697a7-950b-4e94-b800-4425d568df3f\" 
(UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " Apr 24 21:19:08.529598 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.529475 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy\") pod \"99c697a7-950b-4e94-b800-4425d568df3f\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " Apr 24 21:19:08.529598 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.529517 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-web-config\") pod \"99c697a7-950b-4e94-b800-4425d568df3f\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " Apr 24 21:19:08.529598 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.529563 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99c697a7-950b-4e94-b800-4425d568df3f-config-out\") pod \"99c697a7-950b-4e94-b800-4425d568df3f\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " Apr 24 21:19:08.529738 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.529594 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwpsc\" (UniqueName: \"kubernetes.io/projected/99c697a7-950b-4e94-b800-4425d568df3f-kube-api-access-qwpsc\") pod \"99c697a7-950b-4e94-b800-4425d568df3f\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " Apr 24 21:19:08.529738 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.529604 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99c697a7-950b-4e94-b800-4425d568df3f-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "99c697a7-950b-4e94-b800-4425d568df3f" (UID: "99c697a7-950b-4e94-b800-4425d568df3f"). 
InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:19:08.529738 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.529639 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99c697a7-950b-4e94-b800-4425d568df3f-tls-assets\") pod \"99c697a7-950b-4e94-b800-4425d568df3f\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " Apr 24 21:19:08.529738 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.529657 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99c697a7-950b-4e94-b800-4425d568df3f-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "99c697a7-950b-4e94-b800-4425d568df3f" (UID: "99c697a7-950b-4e94-b800-4425d568df3f"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:19:08.529738 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.529689 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-cluster-tls-config\") pod \"99c697a7-950b-4e94-b800-4425d568df3f\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " Apr 24 21:19:08.529738 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.529715 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99c697a7-950b-4e94-b800-4425d568df3f-metrics-client-ca\") pod \"99c697a7-950b-4e94-b800-4425d568df3f\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " Apr 24 21:19:08.530063 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.529745 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"99c697a7-950b-4e94-b800-4425d568df3f\" (UID: \"99c697a7-950b-4e94-b800-4425d568df3f\") " Apr 24 21:19:08.530122 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.530062 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c697a7-950b-4e94-b800-4425d568df3f-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:19:08.530122 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.530083 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/99c697a7-950b-4e94-b800-4425d568df3f-alertmanager-main-db\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:19:08.531424 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.531273 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99c697a7-950b-4e94-b800-4425d568df3f-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "99c697a7-950b-4e94-b800-4425d568df3f" (UID: "99c697a7-950b-4e94-b800-4425d568df3f"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:19:08.532486 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.532459 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "99c697a7-950b-4e94-b800-4425d568df3f" (UID: "99c697a7-950b-4e94-b800-4425d568df3f"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:08.532609 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.532585 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99c697a7-950b-4e94-b800-4425d568df3f-config-out" (OuterVolumeSpecName: "config-out") pod "99c697a7-950b-4e94-b800-4425d568df3f" (UID: "99c697a7-950b-4e94-b800-4425d568df3f"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:19:08.532750 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.532725 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "99c697a7-950b-4e94-b800-4425d568df3f" (UID: "99c697a7-950b-4e94-b800-4425d568df3f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:08.532903 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.532885 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99c697a7-950b-4e94-b800-4425d568df3f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "99c697a7-950b-4e94-b800-4425d568df3f" (UID: "99c697a7-950b-4e94-b800-4425d568df3f"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:19:08.533021 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.532994 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "99c697a7-950b-4e94-b800-4425d568df3f" (UID: "99c697a7-950b-4e94-b800-4425d568df3f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:08.533196 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.533175 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-config-volume" (OuterVolumeSpecName: "config-volume") pod "99c697a7-950b-4e94-b800-4425d568df3f" (UID: "99c697a7-950b-4e94-b800-4425d568df3f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:08.533309 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.533293 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99c697a7-950b-4e94-b800-4425d568df3f-kube-api-access-qwpsc" (OuterVolumeSpecName: "kube-api-access-qwpsc") pod "99c697a7-950b-4e94-b800-4425d568df3f" (UID: "99c697a7-950b-4e94-b800-4425d568df3f"). InnerVolumeSpecName "kube-api-access-qwpsc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:19:08.533662 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.533645 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "99c697a7-950b-4e94-b800-4425d568df3f" (UID: "99c697a7-950b-4e94-b800-4425d568df3f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:08.537160 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.537141 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "99c697a7-950b-4e94-b800-4425d568df3f" (UID: "99c697a7-950b-4e94-b800-4425d568df3f"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:08.543022 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.542999 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-web-config" (OuterVolumeSpecName: "web-config") pod "99c697a7-950b-4e94-b800-4425d568df3f" (UID: "99c697a7-950b-4e94-b800-4425d568df3f"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:08.631010 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.630985 2569 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-config-volume\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:19:08.631010 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.631008 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-main-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:19:08.631133 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.631019 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:19:08.631133 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.631029 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:19:08.631133 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.631038 2569 reconciler_common.go:299] "Volume detached for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-web-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:19:08.631133 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.631046 2569 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99c697a7-950b-4e94-b800-4425d568df3f-config-out\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:19:08.631133 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.631055 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qwpsc\" (UniqueName: \"kubernetes.io/projected/99c697a7-950b-4e94-b800-4425d568df3f-kube-api-access-qwpsc\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:19:08.631133 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.631066 2569 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99c697a7-950b-4e94-b800-4425d568df3f-tls-assets\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:19:08.631133 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.631075 2569 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-cluster-tls-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:19:08.631133 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.631082 2569 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99c697a7-950b-4e94-b800-4425d568df3f-metrics-client-ca\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:19:08.631133 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.631092 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/99c697a7-950b-4e94-b800-4425d568df3f-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:19:08.976640 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.976566 2569 generic.go:358] "Generic (PLEG): container finished" podID="99c697a7-950b-4e94-b800-4425d568df3f" containerID="c53d566c4d4998878f69e54f6090ad97fd8fd31e0d674bd83db0378d7ef129a4" exitCode=0 Apr 24 21:19:08.976774 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.976648 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"99c697a7-950b-4e94-b800-4425d568df3f","Type":"ContainerDied","Data":"c53d566c4d4998878f69e54f6090ad97fd8fd31e0d674bd83db0378d7ef129a4"} Apr 24 21:19:08.976774 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.976687 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:08.976774 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.976702 2569 scope.go:117] "RemoveContainer" containerID="d24219fee94e32ba7c08f7958022a5deaa6f06e56342a4dd3e24bf43dfc59301" Apr 24 21:19:08.976904 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.976691 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"99c697a7-950b-4e94-b800-4425d568df3f","Type":"ContainerDied","Data":"c359f6115ba9d2c461ae37b9975ae02d15fec512e27874c0760bc4f0d70f4444"} Apr 24 21:19:08.985496 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.985377 2569 scope.go:117] "RemoveContainer" containerID="041827f4739185933a16d112e77a1f82922037ec60504f36e597c3b1936b71da" Apr 24 21:19:08.991862 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.991847 2569 scope.go:117] "RemoveContainer" containerID="edde5f4acbfe347b532c1233121b06bc1a41fd8191ffc61afc77cc0c832abb7a" Apr 24 21:19:08.997750 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:08.997733 2569 
scope.go:117] "RemoveContainer" containerID="c53d566c4d4998878f69e54f6090ad97fd8fd31e0d674bd83db0378d7ef129a4" Apr 24 21:19:09.004360 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.004339 2569 scope.go:117] "RemoveContainer" containerID="94ae04ebb07d3e12122cd6bdc910251bce862073d45bdb0ce68371f36cd9e430" Apr 24 21:19:09.004963 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.004943 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:19:09.009251 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.009229 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:19:09.011448 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.011431 2569 scope.go:117] "RemoveContainer" containerID="f7fd57e9c203a24b9ef8905325401ae9da103f67c23a37a75f16ea9ccc9e7c3f" Apr 24 21:19:09.017467 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.017451 2569 scope.go:117] "RemoveContainer" containerID="7cf70100a50031dc7869929184bff8cce72f5cb83fcca8fbc853308a711bb1f5" Apr 24 21:19:09.023326 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.023312 2569 scope.go:117] "RemoveContainer" containerID="d24219fee94e32ba7c08f7958022a5deaa6f06e56342a4dd3e24bf43dfc59301" Apr 24 21:19:09.023561 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:19:09.023542 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d24219fee94e32ba7c08f7958022a5deaa6f06e56342a4dd3e24bf43dfc59301\": container with ID starting with d24219fee94e32ba7c08f7958022a5deaa6f06e56342a4dd3e24bf43dfc59301 not found: ID does not exist" containerID="d24219fee94e32ba7c08f7958022a5deaa6f06e56342a4dd3e24bf43dfc59301" Apr 24 21:19:09.023646 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.023565 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d24219fee94e32ba7c08f7958022a5deaa6f06e56342a4dd3e24bf43dfc59301"} err="failed to get container status \"d24219fee94e32ba7c08f7958022a5deaa6f06e56342a4dd3e24bf43dfc59301\": rpc error: code = NotFound desc = could not find container \"d24219fee94e32ba7c08f7958022a5deaa6f06e56342a4dd3e24bf43dfc59301\": container with ID starting with d24219fee94e32ba7c08f7958022a5deaa6f06e56342a4dd3e24bf43dfc59301 not found: ID does not exist" Apr 24 21:19:09.023646 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.023584 2569 scope.go:117] "RemoveContainer" containerID="041827f4739185933a16d112e77a1f82922037ec60504f36e597c3b1936b71da" Apr 24 21:19:09.023838 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:19:09.023820 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"041827f4739185933a16d112e77a1f82922037ec60504f36e597c3b1936b71da\": container with ID starting with 041827f4739185933a16d112e77a1f82922037ec60504f36e597c3b1936b71da not found: ID does not exist" containerID="041827f4739185933a16d112e77a1f82922037ec60504f36e597c3b1936b71da" Apr 24 21:19:09.023891 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.023843 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"041827f4739185933a16d112e77a1f82922037ec60504f36e597c3b1936b71da"} err="failed to get container status \"041827f4739185933a16d112e77a1f82922037ec60504f36e597c3b1936b71da\": rpc error: code = NotFound desc = could not find container \"041827f4739185933a16d112e77a1f82922037ec60504f36e597c3b1936b71da\": container with ID starting with 041827f4739185933a16d112e77a1f82922037ec60504f36e597c3b1936b71da not found: ID does not exist" Apr 24 21:19:09.023891 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.023859 2569 scope.go:117] "RemoveContainer" containerID="edde5f4acbfe347b532c1233121b06bc1a41fd8191ffc61afc77cc0c832abb7a" Apr 24 21:19:09.024073 ip-10-0-132-118 
kubenswrapper[2569]: E0424 21:19:09.024056 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edde5f4acbfe347b532c1233121b06bc1a41fd8191ffc61afc77cc0c832abb7a\": container with ID starting with edde5f4acbfe347b532c1233121b06bc1a41fd8191ffc61afc77cc0c832abb7a not found: ID does not exist" containerID="edde5f4acbfe347b532c1233121b06bc1a41fd8191ffc61afc77cc0c832abb7a" Apr 24 21:19:09.024137 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.024080 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edde5f4acbfe347b532c1233121b06bc1a41fd8191ffc61afc77cc0c832abb7a"} err="failed to get container status \"edde5f4acbfe347b532c1233121b06bc1a41fd8191ffc61afc77cc0c832abb7a\": rpc error: code = NotFound desc = could not find container \"edde5f4acbfe347b532c1233121b06bc1a41fd8191ffc61afc77cc0c832abb7a\": container with ID starting with edde5f4acbfe347b532c1233121b06bc1a41fd8191ffc61afc77cc0c832abb7a not found: ID does not exist" Apr 24 21:19:09.024137 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.024104 2569 scope.go:117] "RemoveContainer" containerID="c53d566c4d4998878f69e54f6090ad97fd8fd31e0d674bd83db0378d7ef129a4" Apr 24 21:19:09.024350 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:19:09.024331 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c53d566c4d4998878f69e54f6090ad97fd8fd31e0d674bd83db0378d7ef129a4\": container with ID starting with c53d566c4d4998878f69e54f6090ad97fd8fd31e0d674bd83db0378d7ef129a4 not found: ID does not exist" containerID="c53d566c4d4998878f69e54f6090ad97fd8fd31e0d674bd83db0378d7ef129a4" Apr 24 21:19:09.024388 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.024356 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c53d566c4d4998878f69e54f6090ad97fd8fd31e0d674bd83db0378d7ef129a4"} 
err="failed to get container status \"c53d566c4d4998878f69e54f6090ad97fd8fd31e0d674bd83db0378d7ef129a4\": rpc error: code = NotFound desc = could not find container \"c53d566c4d4998878f69e54f6090ad97fd8fd31e0d674bd83db0378d7ef129a4\": container with ID starting with c53d566c4d4998878f69e54f6090ad97fd8fd31e0d674bd83db0378d7ef129a4 not found: ID does not exist" Apr 24 21:19:09.024388 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.024371 2569 scope.go:117] "RemoveContainer" containerID="94ae04ebb07d3e12122cd6bdc910251bce862073d45bdb0ce68371f36cd9e430" Apr 24 21:19:09.024583 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:19:09.024568 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94ae04ebb07d3e12122cd6bdc910251bce862073d45bdb0ce68371f36cd9e430\": container with ID starting with 94ae04ebb07d3e12122cd6bdc910251bce862073d45bdb0ce68371f36cd9e430 not found: ID does not exist" containerID="94ae04ebb07d3e12122cd6bdc910251bce862073d45bdb0ce68371f36cd9e430" Apr 24 21:19:09.024625 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.024587 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ae04ebb07d3e12122cd6bdc910251bce862073d45bdb0ce68371f36cd9e430"} err="failed to get container status \"94ae04ebb07d3e12122cd6bdc910251bce862073d45bdb0ce68371f36cd9e430\": rpc error: code = NotFound desc = could not find container \"94ae04ebb07d3e12122cd6bdc910251bce862073d45bdb0ce68371f36cd9e430\": container with ID starting with 94ae04ebb07d3e12122cd6bdc910251bce862073d45bdb0ce68371f36cd9e430 not found: ID does not exist" Apr 24 21:19:09.024625 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.024600 2569 scope.go:117] "RemoveContainer" containerID="f7fd57e9c203a24b9ef8905325401ae9da103f67c23a37a75f16ea9ccc9e7c3f" Apr 24 21:19:09.024835 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:19:09.024818 2569 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"f7fd57e9c203a24b9ef8905325401ae9da103f67c23a37a75f16ea9ccc9e7c3f\": container with ID starting with f7fd57e9c203a24b9ef8905325401ae9da103f67c23a37a75f16ea9ccc9e7c3f not found: ID does not exist" containerID="f7fd57e9c203a24b9ef8905325401ae9da103f67c23a37a75f16ea9ccc9e7c3f" Apr 24 21:19:09.024910 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.024840 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7fd57e9c203a24b9ef8905325401ae9da103f67c23a37a75f16ea9ccc9e7c3f"} err="failed to get container status \"f7fd57e9c203a24b9ef8905325401ae9da103f67c23a37a75f16ea9ccc9e7c3f\": rpc error: code = NotFound desc = could not find container \"f7fd57e9c203a24b9ef8905325401ae9da103f67c23a37a75f16ea9ccc9e7c3f\": container with ID starting with f7fd57e9c203a24b9ef8905325401ae9da103f67c23a37a75f16ea9ccc9e7c3f not found: ID does not exist" Apr 24 21:19:09.024910 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.024858 2569 scope.go:117] "RemoveContainer" containerID="7cf70100a50031dc7869929184bff8cce72f5cb83fcca8fbc853308a711bb1f5" Apr 24 21:19:09.025097 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:19:09.025082 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cf70100a50031dc7869929184bff8cce72f5cb83fcca8fbc853308a711bb1f5\": container with ID starting with 7cf70100a50031dc7869929184bff8cce72f5cb83fcca8fbc853308a711bb1f5 not found: ID does not exist" containerID="7cf70100a50031dc7869929184bff8cce72f5cb83fcca8fbc853308a711bb1f5" Apr 24 21:19:09.025134 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.025101 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cf70100a50031dc7869929184bff8cce72f5cb83fcca8fbc853308a711bb1f5"} err="failed to get container status \"7cf70100a50031dc7869929184bff8cce72f5cb83fcca8fbc853308a711bb1f5\": rpc 
error: code = NotFound desc = could not find container \"7cf70100a50031dc7869929184bff8cce72f5cb83fcca8fbc853308a711bb1f5\": container with ID starting with 7cf70100a50031dc7869929184bff8cce72f5cb83fcca8fbc853308a711bb1f5 not found: ID does not exist" Apr 24 21:19:09.037707 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.037685 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:19:09.038024 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038010 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="kube-rbac-proxy-metric" Apr 24 21:19:09.038074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038036 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="kube-rbac-proxy-metric" Apr 24 21:19:09.038074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038046 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="prom-label-proxy" Apr 24 21:19:09.038074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038051 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="prom-label-proxy" Apr 24 21:19:09.038074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038057 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="133b2c9a-3e65-4237-98e8-b010b89f5025" containerName="console" Apr 24 21:19:09.038074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038062 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="133b2c9a-3e65-4237-98e8-b010b89f5025" containerName="console" Apr 24 21:19:09.038074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038069 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99c697a7-950b-4e94-b800-4425d568df3f" 
containerName="kube-rbac-proxy-web" Apr 24 21:19:09.038074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038074 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="kube-rbac-proxy-web" Apr 24 21:19:09.038281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038082 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="alertmanager" Apr 24 21:19:09.038281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038087 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="alertmanager" Apr 24 21:19:09.038281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038093 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="init-config-reloader" Apr 24 21:19:09.038281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038099 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="init-config-reloader" Apr 24 21:19:09.038281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038104 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="config-reloader" Apr 24 21:19:09.038281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038109 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="config-reloader" Apr 24 21:19:09.038281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038116 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="kube-rbac-proxy" Apr 24 21:19:09.038281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038121 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="kube-rbac-proxy" Apr 24 21:19:09.038281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038160 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="config-reloader" Apr 24 21:19:09.038281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038168 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="prom-label-proxy" Apr 24 21:19:09.038281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038175 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="133b2c9a-3e65-4237-98e8-b010b89f5025" containerName="console" Apr 24 21:19:09.038281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038180 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="alertmanager" Apr 24 21:19:09.038281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038187 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="kube-rbac-proxy-web" Apr 24 21:19:09.038281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038192 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="kube-rbac-proxy" Apr 24 21:19:09.038281 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.038201 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="99c697a7-950b-4e94-b800-4425d568df3f" containerName="kube-rbac-proxy-metric" Apr 24 21:19:09.042593 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.042577 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.047908 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.047688 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 21:19:09.047908 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.047730 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 21:19:09.047908 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.047746 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 21:19:09.047908 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.047787 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 21:19:09.047908 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.047746 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 21:19:09.047908 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.047859 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 21:19:09.048188 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.048001 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 21:19:09.048188 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.048014 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-j2f79\"" Apr 24 21:19:09.048188 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.048031 2569 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 21:19:09.051479 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.051462 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 21:19:09.055714 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.055693 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:19:09.135678 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.135655 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.135816 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.135685 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.135816 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.135712 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.135816 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.135780 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-config-volume\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.135816 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.135807 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e4bea200-e5a0-4a4d-a216-32f239a14c77-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.136018 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.135836 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4bea200-e5a0-4a4d-a216-32f239a14c77-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.136018 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.135856 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbm7w\" (UniqueName: \"kubernetes.io/projected/e4bea200-e5a0-4a4d-a216-32f239a14c77-kube-api-access-mbm7w\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.136018 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.135877 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.136018 
ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.135951 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e4bea200-e5a0-4a4d-a216-32f239a14c77-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.136018 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.135992 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e4bea200-e5a0-4a4d-a216-32f239a14c77-config-out\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.136228 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.136042 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4bea200-e5a0-4a4d-a216-32f239a14c77-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.136228 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.136087 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.136228 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.136113 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-web-config\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.236821 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.236749 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbm7w\" (UniqueName: \"kubernetes.io/projected/e4bea200-e5a0-4a4d-a216-32f239a14c77-kube-api-access-mbm7w\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.236821 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.236797 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.236821 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.236815 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e4bea200-e5a0-4a4d-a216-32f239a14c77-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.237026 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.236831 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e4bea200-e5a0-4a4d-a216-32f239a14c77-config-out\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.237026 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.236850 2569 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4bea200-e5a0-4a4d-a216-32f239a14c77-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.237026 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.236881 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.237026 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.236897 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-web-config\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.237026 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.236929 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.237271 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.237060 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 
21:19:09.237271 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.237102 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.237271 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.237129 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-config-volume\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.237271 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.237150 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e4bea200-e5a0-4a4d-a216-32f239a14c77-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.237271 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.237179 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4bea200-e5a0-4a4d-a216-32f239a14c77-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.237271 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.237180 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e4bea200-e5a0-4a4d-a216-32f239a14c77-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.238535 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.238227 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4bea200-e5a0-4a4d-a216-32f239a14c77-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.239974 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.239745 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e4bea200-e5a0-4a4d-a216-32f239a14c77-config-out\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.239974 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.239835 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.239974 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.239908 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.240152 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.240013 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-config-volume\") 
pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.240152 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.240070 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-web-config\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.240152 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.240112 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.240332 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.240313 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.240371 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.240360 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4bea200-e5a0-4a4d-a216-32f239a14c77-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.240862 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.240847 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/e4bea200-e5a0-4a4d-a216-32f239a14c77-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.241611 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.241592 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e4bea200-e5a0-4a4d-a216-32f239a14c77-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.244708 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.244690 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbm7w\" (UniqueName: \"kubernetes.io/projected/e4bea200-e5a0-4a4d-a216-32f239a14c77-kube-api-access-mbm7w\") pod \"alertmanager-main-0\" (UID: \"e4bea200-e5a0-4a4d-a216-32f239a14c77\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.357175 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.357141 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:19:09.478997 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.478961 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:19:09.482352 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:19:09.482326 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4bea200_e5a0_4a4d_a216_32f239a14c77.slice/crio-3e282877aa8975b3ae11726434269c6d5cc077ae57d2cd343fc952eac00e3e8e WatchSource:0}: Error finding container 3e282877aa8975b3ae11726434269c6d5cc077ae57d2cd343fc952eac00e3e8e: Status 404 returned error can't find the container with id 3e282877aa8975b3ae11726434269c6d5cc077ae57d2cd343fc952eac00e3e8e Apr 24 21:19:09.981307 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.981274 2569 generic.go:358] "Generic (PLEG): container finished" podID="e4bea200-e5a0-4a4d-a216-32f239a14c77" containerID="83a026239bd200a459f9447dfd204755b4293e56531897de331f67d325e7471f" exitCode=0 Apr 24 21:19:09.981507 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.981367 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e4bea200-e5a0-4a4d-a216-32f239a14c77","Type":"ContainerDied","Data":"83a026239bd200a459f9447dfd204755b4293e56531897de331f67d325e7471f"} Apr 24 21:19:09.981507 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:09.981403 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e4bea200-e5a0-4a4d-a216-32f239a14c77","Type":"ContainerStarted","Data":"3e282877aa8975b3ae11726434269c6d5cc077ae57d2cd343fc952eac00e3e8e"} Apr 24 21:19:10.436545 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:10.436511 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99c697a7-950b-4e94-b800-4425d568df3f" 
path="/var/lib/kubelet/pods/99c697a7-950b-4e94-b800-4425d568df3f/volumes" Apr 24 21:19:10.987844 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:10.987811 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e4bea200-e5a0-4a4d-a216-32f239a14c77","Type":"ContainerStarted","Data":"0b5ee513320fcf01d7615a94e08c93b1946389690fdfba8b36500d9ce216c0b2"} Apr 24 21:19:10.987844 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:10.987846 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e4bea200-e5a0-4a4d-a216-32f239a14c77","Type":"ContainerStarted","Data":"2341d6e09b5f5e6581695227a0eaeb0aed3b5c8b7d03a084883105be95a30ea6"} Apr 24 21:19:10.988228 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:10.987855 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e4bea200-e5a0-4a4d-a216-32f239a14c77","Type":"ContainerStarted","Data":"0cc5ceab6824674b5249432a21634d60d83df6c3697146be652b930db995fea6"} Apr 24 21:19:10.988228 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:10.987864 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e4bea200-e5a0-4a4d-a216-32f239a14c77","Type":"ContainerStarted","Data":"88f39242b519de305c74e0b62d8ab557d1db324a083de2c331b6051812d80cb3"} Apr 24 21:19:10.988228 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:10.987872 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e4bea200-e5a0-4a4d-a216-32f239a14c77","Type":"ContainerStarted","Data":"10e87a3bb2e38db57c34eedb1c70aaa4aee295adf84d23f120d961ab88c0a40c"} Apr 24 21:19:10.988228 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:10.987881 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"e4bea200-e5a0-4a4d-a216-32f239a14c77","Type":"ContainerStarted","Data":"62f974509a91e03c5a8c725c1d2c15fe32b4da4d697a9a15fc8f4a4892d95b75"} Apr 24 21:19:11.014887 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.014821 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.014802689 podStartE2EDuration="2.014802689s" podCreationTimestamp="2026-04-24 21:19:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:19:11.013113816 +0000 UTC m=+139.147512439" watchObservedRunningTime="2026-04-24 21:19:11.014802689 +0000 UTC m=+139.149201312" Apr 24 21:19:11.448615 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.448584 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:19:11.449085 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.448999 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="prometheus" containerID="cri-o://8919244014d0b25bc74dc438777044069de98f92d206c63efe295f48659703e3" gracePeriod=600 Apr 24 21:19:11.449085 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.449016 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="thanos-sidecar" containerID="cri-o://1ceb848336074db6369ae11f058829490a6dcf4f0d64b84749e9aaac85b5c666" gracePeriod=600 Apr 24 21:19:11.449275 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.449067 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="kube-rbac-proxy-thanos" 
containerID="cri-o://856d5c78599a916446b3c0e57aaece5dce88dcf0ed3b9c57e964c514c7f8a79b" gracePeriod=600 Apr 24 21:19:11.449275 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.449042 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="kube-rbac-proxy-web" containerID="cri-o://ae10121a8a1acbc0574b2c802f9bde68e0c50ce6a79f54a960ec8688a7f985ff" gracePeriod=600 Apr 24 21:19:11.449275 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.448998 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="kube-rbac-proxy" containerID="cri-o://e9c378a7e419b6d57fbaf018451b7a888858089d2112053cca98021b9c28d1e3" gracePeriod=600 Apr 24 21:19:11.449275 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.449074 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="config-reloader" containerID="cri-o://d80f6b97d6b023a57e9e4c8cb8f4cc0c1b5ef13fb49cf698a0df86d1550c40f7" gracePeriod=600 Apr 24 21:19:11.992605 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.992574 2569 generic.go:358] "Generic (PLEG): container finished" podID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerID="856d5c78599a916446b3c0e57aaece5dce88dcf0ed3b9c57e964c514c7f8a79b" exitCode=0 Apr 24 21:19:11.992605 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.992595 2569 generic.go:358] "Generic (PLEG): container finished" podID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerID="e9c378a7e419b6d57fbaf018451b7a888858089d2112053cca98021b9c28d1e3" exitCode=0 Apr 24 21:19:11.992605 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.992603 2569 generic.go:358] "Generic (PLEG): container finished" podID="729a228e-b1f6-421c-aefd-99947f19c1fc" 
containerID="1ceb848336074db6369ae11f058829490a6dcf4f0d64b84749e9aaac85b5c666" exitCode=0 Apr 24 21:19:11.992605 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.992610 2569 generic.go:358] "Generic (PLEG): container finished" podID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerID="d80f6b97d6b023a57e9e4c8cb8f4cc0c1b5ef13fb49cf698a0df86d1550c40f7" exitCode=0 Apr 24 21:19:11.992605 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.992615 2569 generic.go:358] "Generic (PLEG): container finished" podID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerID="8919244014d0b25bc74dc438777044069de98f92d206c63efe295f48659703e3" exitCode=0 Apr 24 21:19:11.993074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.992648 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"729a228e-b1f6-421c-aefd-99947f19c1fc","Type":"ContainerDied","Data":"856d5c78599a916446b3c0e57aaece5dce88dcf0ed3b9c57e964c514c7f8a79b"} Apr 24 21:19:11.993074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.992682 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"729a228e-b1f6-421c-aefd-99947f19c1fc","Type":"ContainerDied","Data":"e9c378a7e419b6d57fbaf018451b7a888858089d2112053cca98021b9c28d1e3"} Apr 24 21:19:11.993074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.992692 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"729a228e-b1f6-421c-aefd-99947f19c1fc","Type":"ContainerDied","Data":"1ceb848336074db6369ae11f058829490a6dcf4f0d64b84749e9aaac85b5c666"} Apr 24 21:19:11.993074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:11.992701 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"729a228e-b1f6-421c-aefd-99947f19c1fc","Type":"ContainerDied","Data":"d80f6b97d6b023a57e9e4c8cb8f4cc0c1b5ef13fb49cf698a0df86d1550c40f7"} Apr 24 21:19:11.993074 ip-10-0-132-118 
kubenswrapper[2569]: I0424 21:19:11.992711 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"729a228e-b1f6-421c-aefd-99947f19c1fc","Type":"ContainerDied","Data":"8919244014d0b25bc74dc438777044069de98f92d206c63efe295f48659703e3"}
Apr 24 21:19:12.683274 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.683254 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:12.764425 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764397 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-trusted-ca-bundle\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764425 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764427 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-config\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764645 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764448 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-tls\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764645 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764465 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764645 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764492 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-kubelet-serving-ca-bundle\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764645 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764541 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/729a228e-b1f6-421c-aefd-99947f19c1fc-tls-assets\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764645 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764570 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-thanos-prometheus-http-client-file\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764645 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764598 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/729a228e-b1f6-421c-aefd-99947f19c1fc-config-out\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764645 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764628 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-metrics-client-ca\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764989 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764670 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-k8s-rulefiles-0\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764989 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764708 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-metrics-client-certs\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764989 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764732 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snvjl\" (UniqueName: \"kubernetes.io/projected/729a228e-b1f6-421c-aefd-99947f19c1fc-kube-api-access-snvjl\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764989 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764784 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-kube-rbac-proxy\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764989 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764813 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-grpc-tls\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764989 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764839 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-serving-certs-ca-bundle\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764989 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764880 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-k8s-db\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764989 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764894 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:19:12.764989 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764913 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-web-config\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.764989 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.764955 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"729a228e-b1f6-421c-aefd-99947f19c1fc\" (UID: \"729a228e-b1f6-421c-aefd-99947f19c1fc\") "
Apr 24 21:19:12.765459 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.765015 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:19:12.765459 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.765255 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-trusted-ca-bundle\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.765459 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.765277 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.765956 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.765657 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:19:12.767079 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.767050 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:12.767165 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.767080 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:12.767165 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.767097 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-config" (OuterVolumeSpecName: "config") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:12.767165 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.767144 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/729a228e-b1f6-421c-aefd-99947f19c1fc-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:19:12.767928 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.767616 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:12.767928 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.767843 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:12.768074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.767952 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:12.768130 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.768072 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:19:12.768355 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.768321 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/729a228e-b1f6-421c-aefd-99947f19c1fc-config-out" (OuterVolumeSpecName: "config-out") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:19:12.768649 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.768556 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:19:12.768649 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.768631 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:19:12.769112 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.769077 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:12.769797 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.769776 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:12.770070 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.770048 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/729a228e-b1f6-421c-aefd-99947f19c1fc-kube-api-access-snvjl" (OuterVolumeSpecName: "kube-api-access-snvjl") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "kube-api-access-snvjl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:19:12.780120 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.780072 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-web-config" (OuterVolumeSpecName: "web-config") pod "729a228e-b1f6-421c-aefd-99947f19c1fc" (UID: "729a228e-b1f6-421c-aefd-99947f19c1fc"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:12.865867 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.865848 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.865867 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.865867 2569 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.865980 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.865877 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.865980 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.865886 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.865980 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.865896 2569 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/729a228e-b1f6-421c-aefd-99947f19c1fc-tls-assets\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.865980 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.865905 2569 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-thanos-prometheus-http-client-file\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.865980 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.865915 2569 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/729a228e-b1f6-421c-aefd-99947f19c1fc-config-out\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.865980 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.865924 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-metrics-client-ca\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.865980 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.865934 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.865980 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.865942 2569 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-metrics-client-certs\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.865980 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.865952 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-snvjl\" (UniqueName: \"kubernetes.io/projected/729a228e-b1f6-421c-aefd-99947f19c1fc-kube-api-access-snvjl\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.865980 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.865961 2569 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-kube-rbac-proxy\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.865980 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.865970 2569 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-secret-grpc-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.865980 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.865977 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729a228e-b1f6-421c-aefd-99947f19c1fc-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.865980 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.865985 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/729a228e-b1f6-421c-aefd-99947f19c1fc-prometheus-k8s-db\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.866329 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.865994 2569 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/729a228e-b1f6-421c-aefd-99947f19c1fc-web-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:19:12.997500 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.997471 2569 generic.go:358] "Generic (PLEG): container finished" podID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerID="ae10121a8a1acbc0574b2c802f9bde68e0c50ce6a79f54a960ec8688a7f985ff" exitCode=0
Apr 24 21:19:12.997887 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.997518 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"729a228e-b1f6-421c-aefd-99947f19c1fc","Type":"ContainerDied","Data":"ae10121a8a1acbc0574b2c802f9bde68e0c50ce6a79f54a960ec8688a7f985ff"}
Apr 24 21:19:12.997887 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.997541 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"729a228e-b1f6-421c-aefd-99947f19c1fc","Type":"ContainerDied","Data":"e1a29be36495591d58b6583ad7c374b16fae31185673d620b46ada0ed42d9284"}
Apr 24 21:19:12.997887 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.997556 2569 scope.go:117] "RemoveContainer" containerID="856d5c78599a916446b3c0e57aaece5dce88dcf0ed3b9c57e964c514c7f8a79b"
Apr 24 21:19:12.997887 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:12.997574 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:13.004636 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.004614 2569 scope.go:117] "RemoveContainer" containerID="e9c378a7e419b6d57fbaf018451b7a888858089d2112053cca98021b9c28d1e3"
Apr 24 21:19:13.011144 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.011126 2569 scope.go:117] "RemoveContainer" containerID="ae10121a8a1acbc0574b2c802f9bde68e0c50ce6a79f54a960ec8688a7f985ff"
Apr 24 21:19:13.017058 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.017040 2569 scope.go:117] "RemoveContainer" containerID="1ceb848336074db6369ae11f058829490a6dcf4f0d64b84749e9aaac85b5c666"
Apr 24 21:19:13.020493 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.020476 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:19:13.025317 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.025291 2569 scope.go:117] "RemoveContainer" containerID="d80f6b97d6b023a57e9e4c8cb8f4cc0c1b5ef13fb49cf698a0df86d1550c40f7"
Apr 24 21:19:13.025648 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.025628 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:19:13.031365 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.031351 2569 scope.go:117] "RemoveContainer" containerID="8919244014d0b25bc74dc438777044069de98f92d206c63efe295f48659703e3"
Apr 24 21:19:13.037734 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.037719 2569 scope.go:117] "RemoveContainer" containerID="cce2a5d16c58c100ac32248036bae2de098e84ade8ecc35175cafca12d52c2f4"
Apr 24 21:19:13.043907 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.043891 2569 scope.go:117] "RemoveContainer" containerID="856d5c78599a916446b3c0e57aaece5dce88dcf0ed3b9c57e964c514c7f8a79b"
Apr 24 21:19:13.044138 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:19:13.044120 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"856d5c78599a916446b3c0e57aaece5dce88dcf0ed3b9c57e964c514c7f8a79b\": container with ID starting with 856d5c78599a916446b3c0e57aaece5dce88dcf0ed3b9c57e964c514c7f8a79b not found: ID does not exist" containerID="856d5c78599a916446b3c0e57aaece5dce88dcf0ed3b9c57e964c514c7f8a79b"
Apr 24 21:19:13.044188 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.044143 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"856d5c78599a916446b3c0e57aaece5dce88dcf0ed3b9c57e964c514c7f8a79b"} err="failed to get container status \"856d5c78599a916446b3c0e57aaece5dce88dcf0ed3b9c57e964c514c7f8a79b\": rpc error: code = NotFound desc = could not find container \"856d5c78599a916446b3c0e57aaece5dce88dcf0ed3b9c57e964c514c7f8a79b\": container with ID starting with 856d5c78599a916446b3c0e57aaece5dce88dcf0ed3b9c57e964c514c7f8a79b not found: ID does not exist"
Apr 24 21:19:13.044188 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.044163 2569 scope.go:117] "RemoveContainer" containerID="e9c378a7e419b6d57fbaf018451b7a888858089d2112053cca98021b9c28d1e3"
Apr 24 21:19:13.044381 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:19:13.044356 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9c378a7e419b6d57fbaf018451b7a888858089d2112053cca98021b9c28d1e3\": container with ID starting with e9c378a7e419b6d57fbaf018451b7a888858089d2112053cca98021b9c28d1e3 not found: ID does not exist" containerID="e9c378a7e419b6d57fbaf018451b7a888858089d2112053cca98021b9c28d1e3"
Apr 24 21:19:13.044421 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.044389 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c378a7e419b6d57fbaf018451b7a888858089d2112053cca98021b9c28d1e3"} err="failed to get container status \"e9c378a7e419b6d57fbaf018451b7a888858089d2112053cca98021b9c28d1e3\": rpc error: code = NotFound desc = could not find container \"e9c378a7e419b6d57fbaf018451b7a888858089d2112053cca98021b9c28d1e3\": container with ID starting with e9c378a7e419b6d57fbaf018451b7a888858089d2112053cca98021b9c28d1e3 not found: ID does not exist"
Apr 24 21:19:13.044421 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.044405 2569 scope.go:117] "RemoveContainer" containerID="ae10121a8a1acbc0574b2c802f9bde68e0c50ce6a79f54a960ec8688a7f985ff"
Apr 24 21:19:13.044614 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:19:13.044600 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae10121a8a1acbc0574b2c802f9bde68e0c50ce6a79f54a960ec8688a7f985ff\": container with ID starting with ae10121a8a1acbc0574b2c802f9bde68e0c50ce6a79f54a960ec8688a7f985ff not found: ID does not exist" containerID="ae10121a8a1acbc0574b2c802f9bde68e0c50ce6a79f54a960ec8688a7f985ff"
Apr 24 21:19:13.044656 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.044617 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae10121a8a1acbc0574b2c802f9bde68e0c50ce6a79f54a960ec8688a7f985ff"} err="failed to get container status \"ae10121a8a1acbc0574b2c802f9bde68e0c50ce6a79f54a960ec8688a7f985ff\": rpc error: code = NotFound desc = could not find container \"ae10121a8a1acbc0574b2c802f9bde68e0c50ce6a79f54a960ec8688a7f985ff\": container with ID starting with ae10121a8a1acbc0574b2c802f9bde68e0c50ce6a79f54a960ec8688a7f985ff not found: ID does not exist"
Apr 24 21:19:13.044656 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.044628 2569 scope.go:117] "RemoveContainer" containerID="1ceb848336074db6369ae11f058829490a6dcf4f0d64b84749e9aaac85b5c666"
Apr 24 21:19:13.044819 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:19:13.044803 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ceb848336074db6369ae11f058829490a6dcf4f0d64b84749e9aaac85b5c666\": container with ID starting with 1ceb848336074db6369ae11f058829490a6dcf4f0d64b84749e9aaac85b5c666 not found: ID does not exist" containerID="1ceb848336074db6369ae11f058829490a6dcf4f0d64b84749e9aaac85b5c666"
Apr 24 21:19:13.044879 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.044822 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ceb848336074db6369ae11f058829490a6dcf4f0d64b84749e9aaac85b5c666"} err="failed to get container status \"1ceb848336074db6369ae11f058829490a6dcf4f0d64b84749e9aaac85b5c666\": rpc error: code = NotFound desc = could not find container \"1ceb848336074db6369ae11f058829490a6dcf4f0d64b84749e9aaac85b5c666\": container with ID starting with 1ceb848336074db6369ae11f058829490a6dcf4f0d64b84749e9aaac85b5c666 not found: ID does not exist"
Apr 24 21:19:13.044879 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.044832 2569 scope.go:117] "RemoveContainer" containerID="d80f6b97d6b023a57e9e4c8cb8f4cc0c1b5ef13fb49cf698a0df86d1550c40f7"
Apr 24 21:19:13.045064 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:19:13.045047 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d80f6b97d6b023a57e9e4c8cb8f4cc0c1b5ef13fb49cf698a0df86d1550c40f7\": container with ID starting with d80f6b97d6b023a57e9e4c8cb8f4cc0c1b5ef13fb49cf698a0df86d1550c40f7 not found: ID does not exist" containerID="d80f6b97d6b023a57e9e4c8cb8f4cc0c1b5ef13fb49cf698a0df86d1550c40f7"
Apr 24 21:19:13.045105 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.045067 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d80f6b97d6b023a57e9e4c8cb8f4cc0c1b5ef13fb49cf698a0df86d1550c40f7"} err="failed to get container status \"d80f6b97d6b023a57e9e4c8cb8f4cc0c1b5ef13fb49cf698a0df86d1550c40f7\": rpc error: code = NotFound desc = could not find container \"d80f6b97d6b023a57e9e4c8cb8f4cc0c1b5ef13fb49cf698a0df86d1550c40f7\": container with ID starting with d80f6b97d6b023a57e9e4c8cb8f4cc0c1b5ef13fb49cf698a0df86d1550c40f7 not found: ID does not exist"
Apr 24 21:19:13.045105 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.045079 2569 scope.go:117] "RemoveContainer" containerID="8919244014d0b25bc74dc438777044069de98f92d206c63efe295f48659703e3"
Apr 24 21:19:13.045282 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:19:13.045262 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8919244014d0b25bc74dc438777044069de98f92d206c63efe295f48659703e3\": container with ID starting with 8919244014d0b25bc74dc438777044069de98f92d206c63efe295f48659703e3 not found: ID does not exist" containerID="8919244014d0b25bc74dc438777044069de98f92d206c63efe295f48659703e3"
Apr 24 21:19:13.045320 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.045286 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8919244014d0b25bc74dc438777044069de98f92d206c63efe295f48659703e3"} err="failed to get container status \"8919244014d0b25bc74dc438777044069de98f92d206c63efe295f48659703e3\": rpc error: code = NotFound desc = could not find container \"8919244014d0b25bc74dc438777044069de98f92d206c63efe295f48659703e3\": container with ID starting with 8919244014d0b25bc74dc438777044069de98f92d206c63efe295f48659703e3 not found: ID does not exist"
Apr 24 21:19:13.045320 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.045300 2569 scope.go:117] "RemoveContainer" containerID="cce2a5d16c58c100ac32248036bae2de098e84ade8ecc35175cafca12d52c2f4"
Apr 24 21:19:13.045519 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:19:13.045502 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cce2a5d16c58c100ac32248036bae2de098e84ade8ecc35175cafca12d52c2f4\": container with ID starting with cce2a5d16c58c100ac32248036bae2de098e84ade8ecc35175cafca12d52c2f4 not found: ID does not exist" containerID="cce2a5d16c58c100ac32248036bae2de098e84ade8ecc35175cafca12d52c2f4"
Apr 24 21:19:13.045560 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.045523 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cce2a5d16c58c100ac32248036bae2de098e84ade8ecc35175cafca12d52c2f4"} err="failed to get container status \"cce2a5d16c58c100ac32248036bae2de098e84ade8ecc35175cafca12d52c2f4\": rpc error: code = NotFound desc = could not find container \"cce2a5d16c58c100ac32248036bae2de098e84ade8ecc35175cafca12d52c2f4\": container with ID starting with cce2a5d16c58c100ac32248036bae2de098e84ade8ecc35175cafca12d52c2f4 not found: ID does not exist"
Apr 24 21:19:13.051350 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051323 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:19:13.051633 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051620 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="thanos-sidecar"
Apr 24 21:19:13.051685 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051635 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="thanos-sidecar"
Apr 24 21:19:13.051685 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051652 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="config-reloader"
Apr 24 21:19:13.051685 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051660 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="config-reloader"
Apr 24 21:19:13.051685 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051671 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="init-config-reloader"
Apr 24 21:19:13.051685 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051678 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="init-config-reloader"
Apr 24 21:19:13.051849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051690 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="prometheus"
Apr 24 21:19:13.051849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051696 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="prometheus"
Apr 24 21:19:13.051849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051704 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="kube-rbac-proxy-web"
Apr 24 21:19:13.051849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051709 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="kube-rbac-proxy-web"
Apr 24 21:19:13.051849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051714 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="kube-rbac-proxy"
Apr 24 21:19:13.051849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051718 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="kube-rbac-proxy"
Apr 24 21:19:13.051849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051723 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="kube-rbac-proxy-thanos"
Apr 24 21:19:13.051849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051729 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="kube-rbac-proxy-thanos"
Apr 24 21:19:13.051849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051791 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="thanos-sidecar"
Apr 24 21:19:13.051849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051799 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="kube-rbac-proxy-thanos"
Apr 24 21:19:13.051849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051806 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="kube-rbac-proxy-web"
Apr 24 21:19:13.051849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051814 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="prometheus"
Apr 24 21:19:13.051849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051826 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="config-reloader"
Apr 24 21:19:13.051849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.051841 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" containerName="kube-rbac-proxy"
Apr 24 21:19:13.056785 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.056744 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:13.059436 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.059415 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-h2dg2\""
Apr 24 21:19:13.059436 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.059427 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 24 21:19:13.059436 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.059435 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 24 21:19:13.059689 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.059415 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 24 21:19:13.059689 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.059477 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 24 21:19:13.059863 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.059693 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 24 21:19:13.059863 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.059895 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 24 21:19:13.060002 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.059921 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 24 21:19:13.060057 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.060040 2569 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 21:19:13.060198 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.060181 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 21:19:13.060290 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.060246 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-123athu0f2uqi\"" Apr 24 21:19:13.060408 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.060372 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 21:19:13.063889 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.063246 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 21:19:13.069939 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.069922 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 21:19:13.077457 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.077424 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:19:13.169377 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169355 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c8b31da7-e241-4355-9dcc-31ff04cfdc36-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169479 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169383 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cn6ms\" (UniqueName: \"kubernetes.io/projected/c8b31da7-e241-4355-9dcc-31ff04cfdc36-kube-api-access-cn6ms\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169479 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169402 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-config\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169479 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169418 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c8b31da7-e241-4355-9dcc-31ff04cfdc36-config-out\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169580 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169487 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169580 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169533 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169580 ip-10-0-132-118 
kubenswrapper[2569]: I0424 21:19:13.169554 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169704 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169576 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169704 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169608 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b31da7-e241-4355-9dcc-31ff04cfdc36-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169704 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169628 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169704 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169645 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/c8b31da7-e241-4355-9dcc-31ff04cfdc36-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169704 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169665 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b31da7-e241-4355-9dcc-31ff04cfdc36-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169704 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169686 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169704 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169701 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c8b31da7-e241-4355-9dcc-31ff04cfdc36-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169919 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169771 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c8b31da7-e241-4355-9dcc-31ff04cfdc36-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169919 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169795 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b31da7-e241-4355-9dcc-31ff04cfdc36-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169919 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169821 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-web-config\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.169919 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.169839 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.270740 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.270711 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c8b31da7-e241-4355-9dcc-31ff04cfdc36-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.270858 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.270742 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b31da7-e241-4355-9dcc-31ff04cfdc36-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.270858 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.270793 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-web-config\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.270858 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.270821 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.271006 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.270858 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c8b31da7-e241-4355-9dcc-31ff04cfdc36-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.271006 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.270886 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cn6ms\" (UniqueName: \"kubernetes.io/projected/c8b31da7-e241-4355-9dcc-31ff04cfdc36-kube-api-access-cn6ms\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.271006 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.270917 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-config\") pod 
\"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.271006 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.270943 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c8b31da7-e241-4355-9dcc-31ff04cfdc36-config-out\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.271006 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.270969 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.271006 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.271000 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.271268 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.271026 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.271268 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.271050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.271268 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.271076 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b31da7-e241-4355-9dcc-31ff04cfdc36-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.271268 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.271105 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.271268 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.271132 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c8b31da7-e241-4355-9dcc-31ff04cfdc36-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.271268 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.271165 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b31da7-e241-4355-9dcc-31ff04cfdc36-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.271268 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.271200 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.271268 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.271229 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c8b31da7-e241-4355-9dcc-31ff04cfdc36-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.271640 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.271579 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b31da7-e241-4355-9dcc-31ff04cfdc36-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.271640 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.271614 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c8b31da7-e241-4355-9dcc-31ff04cfdc36-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.272110 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.271894 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c8b31da7-e241-4355-9dcc-31ff04cfdc36-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.272711 ip-10-0-132-118 kubenswrapper[2569]: 
I0424 21:19:13.272689 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b31da7-e241-4355-9dcc-31ff04cfdc36-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.274161 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.273983 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-web-config\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.274161 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.273989 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c8b31da7-e241-4355-9dcc-31ff04cfdc36-config-out\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.274161 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.274010 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.274161 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.274066 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.274161 ip-10-0-132-118 kubenswrapper[2569]: I0424 
21:19:13.274088 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.274464 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.274424 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c8b31da7-e241-4355-9dcc-31ff04cfdc36-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.274682 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.274639 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8b31da7-e241-4355-9dcc-31ff04cfdc36-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.275389 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.275361 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.275634 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.275608 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c8b31da7-e241-4355-9dcc-31ff04cfdc36-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.276111 
ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.276093 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-config\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.276265 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.276247 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.276391 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.276375 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.276463 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.276443 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c8b31da7-e241-4355-9dcc-31ff04cfdc36-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.278580 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.278559 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn6ms\" (UniqueName: \"kubernetes.io/projected/c8b31da7-e241-4355-9dcc-31ff04cfdc36-kube-api-access-cn6ms\") pod \"prometheus-k8s-0\" (UID: \"c8b31da7-e241-4355-9dcc-31ff04cfdc36\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.370430 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.370378 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:13.488865 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:13.488834 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:19:13.491533 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:19:13.491509 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8b31da7_e241_4355_9dcc_31ff04cfdc36.slice/crio-e40950e543cb795c0ba4ca7b7177ecf1377549d0ea0d53b64aeed8a1db519a8b WatchSource:0}: Error finding container e40950e543cb795c0ba4ca7b7177ecf1377549d0ea0d53b64aeed8a1db519a8b: Status 404 returned error can't find the container with id e40950e543cb795c0ba4ca7b7177ecf1377549d0ea0d53b64aeed8a1db519a8b Apr 24 21:19:14.001499 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:14.001467 2569 generic.go:358] "Generic (PLEG): container finished" podID="c8b31da7-e241-4355-9dcc-31ff04cfdc36" containerID="463a88931de46ee19964cc3a9d2348e2668916319462ce22b3130ed42e5cdf8e" exitCode=0 Apr 24 21:19:14.001913 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:14.001562 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c8b31da7-e241-4355-9dcc-31ff04cfdc36","Type":"ContainerDied","Data":"463a88931de46ee19964cc3a9d2348e2668916319462ce22b3130ed42e5cdf8e"} Apr 24 21:19:14.001913 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:14.001598 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c8b31da7-e241-4355-9dcc-31ff04cfdc36","Type":"ContainerStarted","Data":"e40950e543cb795c0ba4ca7b7177ecf1377549d0ea0d53b64aeed8a1db519a8b"} Apr 24 21:19:14.436401 ip-10-0-132-118 kubenswrapper[2569]: I0424 
21:19:14.436369 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="729a228e-b1f6-421c-aefd-99947f19c1fc" path="/var/lib/kubelet/pods/729a228e-b1f6-421c-aefd-99947f19c1fc/volumes"
Apr 24 21:19:15.008068 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:15.008034 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c8b31da7-e241-4355-9dcc-31ff04cfdc36","Type":"ContainerStarted","Data":"b16523c63febfb270979db141f10ce4b32f8d25f60138d6849a34abeb6688d0c"}
Apr 24 21:19:15.008068 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:15.008066 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c8b31da7-e241-4355-9dcc-31ff04cfdc36","Type":"ContainerStarted","Data":"ea3291ace4d1005af45acb2cf48ef7805a652c9b2f57e5eac06d7feb2191a9bb"}
Apr 24 21:19:15.008068 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:15.008077 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c8b31da7-e241-4355-9dcc-31ff04cfdc36","Type":"ContainerStarted","Data":"dce73daa00b58d56747400a8a736fa3eea57ef30ba1abafa2ddae10de8225eb9"}
Apr 24 21:19:15.008651 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:15.008086 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c8b31da7-e241-4355-9dcc-31ff04cfdc36","Type":"ContainerStarted","Data":"9f1fa70e24359fc55c45abe5f374ed182dd42d4c892225cdb7d38a1ab106d595"}
Apr 24 21:19:15.008651 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:15.008094 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c8b31da7-e241-4355-9dcc-31ff04cfdc36","Type":"ContainerStarted","Data":"e243e097b441e15affbdbcccd3444dde55dcf4cde827663920e9cfd118e8406a"}
Apr 24 21:19:15.008651 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:15.008102 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c8b31da7-e241-4355-9dcc-31ff04cfdc36","Type":"ContainerStarted","Data":"7353023eff740905311616071176385d3eefd9a1ce591aa9e21f2dfaa1def3f4"}
Apr 24 21:19:15.038928 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:15.038874 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.038856414 podStartE2EDuration="2.038856414s" podCreationTimestamp="2026-04-24 21:19:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:19:15.037011184 +0000 UTC m=+143.171409804" watchObservedRunningTime="2026-04-24 21:19:15.038856414 +0000 UTC m=+143.173255033"
Apr 24 21:19:18.371389 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:18.371355 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:54.479983 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:54.479953 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-jj6lg"]
Apr 24 21:19:54.484513 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:54.484495 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jj6lg"
Apr 24 21:19:54.487009 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:54.486985 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 21:19:54.492933 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:54.492902 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jj6lg"]
Apr 24 21:19:54.589985 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:54.589960 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f98087fe-7185-47dc-a94a-dc47079533a9-kubelet-config\") pod \"global-pull-secret-syncer-jj6lg\" (UID: \"f98087fe-7185-47dc-a94a-dc47079533a9\") " pod="kube-system/global-pull-secret-syncer-jj6lg"
Apr 24 21:19:54.590135 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:54.589990 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f98087fe-7185-47dc-a94a-dc47079533a9-original-pull-secret\") pod \"global-pull-secret-syncer-jj6lg\" (UID: \"f98087fe-7185-47dc-a94a-dc47079533a9\") " pod="kube-system/global-pull-secret-syncer-jj6lg"
Apr 24 21:19:54.590135 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:54.590023 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f98087fe-7185-47dc-a94a-dc47079533a9-dbus\") pod \"global-pull-secret-syncer-jj6lg\" (UID: \"f98087fe-7185-47dc-a94a-dc47079533a9\") " pod="kube-system/global-pull-secret-syncer-jj6lg"
Apr 24 21:19:54.691236 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:54.691205 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f98087fe-7185-47dc-a94a-dc47079533a9-kubelet-config\") pod \"global-pull-secret-syncer-jj6lg\" (UID: \"f98087fe-7185-47dc-a94a-dc47079533a9\") " pod="kube-system/global-pull-secret-syncer-jj6lg"
Apr 24 21:19:54.691351 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:54.691240 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f98087fe-7185-47dc-a94a-dc47079533a9-original-pull-secret\") pod \"global-pull-secret-syncer-jj6lg\" (UID: \"f98087fe-7185-47dc-a94a-dc47079533a9\") " pod="kube-system/global-pull-secret-syncer-jj6lg"
Apr 24 21:19:54.691351 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:54.691285 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f98087fe-7185-47dc-a94a-dc47079533a9-dbus\") pod \"global-pull-secret-syncer-jj6lg\" (UID: \"f98087fe-7185-47dc-a94a-dc47079533a9\") " pod="kube-system/global-pull-secret-syncer-jj6lg"
Apr 24 21:19:54.691351 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:54.691313 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f98087fe-7185-47dc-a94a-dc47079533a9-kubelet-config\") pod \"global-pull-secret-syncer-jj6lg\" (UID: \"f98087fe-7185-47dc-a94a-dc47079533a9\") " pod="kube-system/global-pull-secret-syncer-jj6lg"
Apr 24 21:19:54.691493 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:54.691449 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f98087fe-7185-47dc-a94a-dc47079533a9-dbus\") pod \"global-pull-secret-syncer-jj6lg\" (UID: \"f98087fe-7185-47dc-a94a-dc47079533a9\") " pod="kube-system/global-pull-secret-syncer-jj6lg"
Apr 24 21:19:54.693576 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:54.693556 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f98087fe-7185-47dc-a94a-dc47079533a9-original-pull-secret\") pod \"global-pull-secret-syncer-jj6lg\" (UID: \"f98087fe-7185-47dc-a94a-dc47079533a9\") " pod="kube-system/global-pull-secret-syncer-jj6lg"
Apr 24 21:19:54.794547 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:54.794522 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jj6lg"
Apr 24 21:19:54.905666 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:54.905640 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jj6lg"]
Apr 24 21:19:54.909994 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:19:54.909962 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf98087fe_7185_47dc_a94a_dc47079533a9.slice/crio-64b732def7f03a635c7479596f12e0947a7d33c673d83595f0287db0607d8f1c WatchSource:0}: Error finding container 64b732def7f03a635c7479596f12e0947a7d33c673d83595f0287db0607d8f1c: Status 404 returned error can't find the container with id 64b732def7f03a635c7479596f12e0947a7d33c673d83595f0287db0607d8f1c
Apr 24 21:19:55.125636 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:55.125565 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jj6lg" event={"ID":"f98087fe-7185-47dc-a94a-dc47079533a9","Type":"ContainerStarted","Data":"64b732def7f03a635c7479596f12e0947a7d33c673d83595f0287db0607d8f1c"}
Apr 24 21:19:59.138321 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:59.138286 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jj6lg" event={"ID":"f98087fe-7185-47dc-a94a-dc47079533a9","Type":"ContainerStarted","Data":"8e0dfb06882b8941c7dd5c66faa251f2e6dc036c191b1eb7923329f8c62cbb3c"}
Apr 24 21:19:59.155011 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:19:59.154963 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-jj6lg" podStartSLOduration=1.6202033 podStartE2EDuration="5.154948533s" podCreationTimestamp="2026-04-24 21:19:54 +0000 UTC" firstStartedPulling="2026-04-24 21:19:54.911594122 +0000 UTC m=+183.045992722" lastFinishedPulling="2026-04-24 21:19:58.446339339 +0000 UTC m=+186.580737955" observedRunningTime="2026-04-24 21:19:59.153698704 +0000 UTC m=+187.288097325" watchObservedRunningTime="2026-04-24 21:19:59.154948533 +0000 UTC m=+187.289347150"
Apr 24 21:20:13.371132 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:20:13.371093 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:20:13.386744 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:20:13.386711 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:20:14.197689 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:20:14.197665 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:22:59.012471 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:22:59.012440 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-gs7ns"]
Apr 24 21:22:59.015563 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:22:59.015543 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-gs7ns"
Apr 24 21:22:59.018176 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:22:59.018155 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 24 21:22:59.018301 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:22:59.018286 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 24 21:22:59.018415 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:22:59.018400 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 24 21:22:59.019436 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:22:59.019417 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-xmwjr\""
Apr 24 21:22:59.023307 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:22:59.023283 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-gs7ns"]
Apr 24 21:22:59.100983 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:22:59.100955 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a228f5a-483e-497b-a305-5d8f68b5ed9f-cert\") pod \"odh-model-controller-696fc77849-gs7ns\" (UID: \"0a228f5a-483e-497b-a305-5d8f68b5ed9f\") " pod="kserve/odh-model-controller-696fc77849-gs7ns"
Apr 24 21:22:59.101239 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:22:59.101210 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7rsq\" (UniqueName: \"kubernetes.io/projected/0a228f5a-483e-497b-a305-5d8f68b5ed9f-kube-api-access-g7rsq\") pod \"odh-model-controller-696fc77849-gs7ns\" (UID: \"0a228f5a-483e-497b-a305-5d8f68b5ed9f\") " pod="kserve/odh-model-controller-696fc77849-gs7ns"
Apr 24 21:22:59.202242 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:22:59.202213 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a228f5a-483e-497b-a305-5d8f68b5ed9f-cert\") pod \"odh-model-controller-696fc77849-gs7ns\" (UID: \"0a228f5a-483e-497b-a305-5d8f68b5ed9f\") " pod="kserve/odh-model-controller-696fc77849-gs7ns"
Apr 24 21:22:59.202415 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:22:59.202252 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7rsq\" (UniqueName: \"kubernetes.io/projected/0a228f5a-483e-497b-a305-5d8f68b5ed9f-kube-api-access-g7rsq\") pod \"odh-model-controller-696fc77849-gs7ns\" (UID: \"0a228f5a-483e-497b-a305-5d8f68b5ed9f\") " pod="kserve/odh-model-controller-696fc77849-gs7ns"
Apr 24 21:22:59.202415 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:22:59.202371 2569 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 24 21:22:59.202534 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:22:59.202442 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a228f5a-483e-497b-a305-5d8f68b5ed9f-cert podName:0a228f5a-483e-497b-a305-5d8f68b5ed9f nodeName:}" failed. No retries permitted until 2026-04-24 21:22:59.702418389 +0000 UTC m=+367.836817001 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0a228f5a-483e-497b-a305-5d8f68b5ed9f-cert") pod "odh-model-controller-696fc77849-gs7ns" (UID: "0a228f5a-483e-497b-a305-5d8f68b5ed9f") : secret "odh-model-controller-webhook-cert" not found
Apr 24 21:22:59.213628 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:22:59.213603 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7rsq\" (UniqueName: \"kubernetes.io/projected/0a228f5a-483e-497b-a305-5d8f68b5ed9f-kube-api-access-g7rsq\") pod \"odh-model-controller-696fc77849-gs7ns\" (UID: \"0a228f5a-483e-497b-a305-5d8f68b5ed9f\") " pod="kserve/odh-model-controller-696fc77849-gs7ns"
Apr 24 21:22:59.706881 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:22:59.706838 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a228f5a-483e-497b-a305-5d8f68b5ed9f-cert\") pod \"odh-model-controller-696fc77849-gs7ns\" (UID: \"0a228f5a-483e-497b-a305-5d8f68b5ed9f\") " pod="kserve/odh-model-controller-696fc77849-gs7ns"
Apr 24 21:22:59.709109 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:22:59.709089 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a228f5a-483e-497b-a305-5d8f68b5ed9f-cert\") pod \"odh-model-controller-696fc77849-gs7ns\" (UID: \"0a228f5a-483e-497b-a305-5d8f68b5ed9f\") " pod="kserve/odh-model-controller-696fc77849-gs7ns"
Apr 24 21:22:59.926581 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:22:59.926547 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-gs7ns"
Apr 24 21:23:00.044586 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:00.044565 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-gs7ns"]
Apr 24 21:23:00.046689 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:23:00.046661 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a228f5a_483e_497b_a305_5d8f68b5ed9f.slice/crio-ff29d91bfe47727b055f340c4ddb1855ec2a15bc02f60ca314633b42e0e3973a WatchSource:0}: Error finding container ff29d91bfe47727b055f340c4ddb1855ec2a15bc02f60ca314633b42e0e3973a: Status 404 returned error can't find the container with id ff29d91bfe47727b055f340c4ddb1855ec2a15bc02f60ca314633b42e0e3973a
Apr 24 21:23:00.048323 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:00.048307 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:23:00.639218 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:00.639179 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-gs7ns" event={"ID":"0a228f5a-483e-497b-a305-5d8f68b5ed9f","Type":"ContainerStarted","Data":"ff29d91bfe47727b055f340c4ddb1855ec2a15bc02f60ca314633b42e0e3973a"}
Apr 24 21:23:03.649729 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:03.649687 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-gs7ns" event={"ID":"0a228f5a-483e-497b-a305-5d8f68b5ed9f","Type":"ContainerStarted","Data":"f3937ce467157c52d3ea4dce5cde3c3bfc2e4f1b2ffa4bebaa731a75c14833c4"}
Apr 24 21:23:03.650118 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:03.649821 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-gs7ns"
Apr 24 21:23:03.666861 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:03.666817 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-gs7ns" podStartSLOduration=2.747610945 podStartE2EDuration="5.666804699s" podCreationTimestamp="2026-04-24 21:22:58 +0000 UTC" firstStartedPulling="2026-04-24 21:23:00.04842556 +0000 UTC m=+368.182824159" lastFinishedPulling="2026-04-24 21:23:02.9676193 +0000 UTC m=+371.102017913" observedRunningTime="2026-04-24 21:23:03.665778313 +0000 UTC m=+371.800176931" watchObservedRunningTime="2026-04-24 21:23:03.666804699 +0000 UTC m=+371.801203321"
Apr 24 21:23:14.655113 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:14.655084 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-gs7ns"
Apr 24 21:23:15.484861 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:15.484830 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-vzmdl"]
Apr 24 21:23:15.487915 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:15.487899 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-vzmdl"
Apr 24 21:23:15.490483 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:15.490461 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 24 21:23:15.490627 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:15.490536 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-ljxsr\""
Apr 24 21:23:15.495497 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:15.495478 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-vzmdl"]
Apr 24 21:23:15.531737 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:15.531715 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gvfj\" (UniqueName: \"kubernetes.io/projected/219f70c0-46a5-4c30-9da0-a5a37e49260c-kube-api-access-6gvfj\") pod \"s3-init-vzmdl\" (UID: \"219f70c0-46a5-4c30-9da0-a5a37e49260c\") " pod="kserve/s3-init-vzmdl"
Apr 24 21:23:15.632430 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:15.632403 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gvfj\" (UniqueName: \"kubernetes.io/projected/219f70c0-46a5-4c30-9da0-a5a37e49260c-kube-api-access-6gvfj\") pod \"s3-init-vzmdl\" (UID: \"219f70c0-46a5-4c30-9da0-a5a37e49260c\") " pod="kserve/s3-init-vzmdl"
Apr 24 21:23:15.642118 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:15.642097 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gvfj\" (UniqueName: \"kubernetes.io/projected/219f70c0-46a5-4c30-9da0-a5a37e49260c-kube-api-access-6gvfj\") pod \"s3-init-vzmdl\" (UID: \"219f70c0-46a5-4c30-9da0-a5a37e49260c\") " pod="kserve/s3-init-vzmdl"
Apr 24 21:23:15.806859 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:15.806820 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-vzmdl"
Apr 24 21:23:15.920016 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:15.919986 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-vzmdl"]
Apr 24 21:23:15.923462 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:23:15.923429 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod219f70c0_46a5_4c30_9da0_a5a37e49260c.slice/crio-de8efe94c271514b13540fc0a91efd972ec8c08724895269f190a93b2002dd02 WatchSource:0}: Error finding container de8efe94c271514b13540fc0a91efd972ec8c08724895269f190a93b2002dd02: Status 404 returned error can't find the container with id de8efe94c271514b13540fc0a91efd972ec8c08724895269f190a93b2002dd02
Apr 24 21:23:16.690209 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:16.690173 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-vzmdl" event={"ID":"219f70c0-46a5-4c30-9da0-a5a37e49260c","Type":"ContainerStarted","Data":"de8efe94c271514b13540fc0a91efd972ec8c08724895269f190a93b2002dd02"}
Apr 24 21:23:20.703870 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:20.703784 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-vzmdl" event={"ID":"219f70c0-46a5-4c30-9da0-a5a37e49260c","Type":"ContainerStarted","Data":"fd276917ba503f6c40a1e7766df34860536af81231719087eef649c0f3b06e35"}
Apr 24 21:23:20.721603 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:20.721561 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-vzmdl" podStartSLOduration=1.348783274 podStartE2EDuration="5.721548199s" podCreationTimestamp="2026-04-24 21:23:15 +0000 UTC" firstStartedPulling="2026-04-24 21:23:15.925228412 +0000 UTC m=+384.059627021" lastFinishedPulling="2026-04-24 21:23:20.297993341 +0000 UTC m=+388.432391946" observedRunningTime="2026-04-24 21:23:20.720465662 +0000 UTC m=+388.854864295" watchObservedRunningTime="2026-04-24 21:23:20.721548199 +0000 UTC m=+388.855946820"
Apr 24 21:23:23.713155 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:23.713122 2569 generic.go:358] "Generic (PLEG): container finished" podID="219f70c0-46a5-4c30-9da0-a5a37e49260c" containerID="fd276917ba503f6c40a1e7766df34860536af81231719087eef649c0f3b06e35" exitCode=0
Apr 24 21:23:23.713511 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:23.713202 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-vzmdl" event={"ID":"219f70c0-46a5-4c30-9da0-a5a37e49260c","Type":"ContainerDied","Data":"fd276917ba503f6c40a1e7766df34860536af81231719087eef649c0f3b06e35"}
Apr 24 21:23:24.841698 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:24.841678 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-vzmdl"
Apr 24 21:23:24.916892 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:24.916864 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gvfj\" (UniqueName: \"kubernetes.io/projected/219f70c0-46a5-4c30-9da0-a5a37e49260c-kube-api-access-6gvfj\") pod \"219f70c0-46a5-4c30-9da0-a5a37e49260c\" (UID: \"219f70c0-46a5-4c30-9da0-a5a37e49260c\") "
Apr 24 21:23:24.918908 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:24.918883 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219f70c0-46a5-4c30-9da0-a5a37e49260c-kube-api-access-6gvfj" (OuterVolumeSpecName: "kube-api-access-6gvfj") pod "219f70c0-46a5-4c30-9da0-a5a37e49260c" (UID: "219f70c0-46a5-4c30-9da0-a5a37e49260c"). InnerVolumeSpecName "kube-api-access-6gvfj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:23:25.017415 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:25.017347 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6gvfj\" (UniqueName: \"kubernetes.io/projected/219f70c0-46a5-4c30-9da0-a5a37e49260c-kube-api-access-6gvfj\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:23:25.720110 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:25.720078 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-vzmdl" event={"ID":"219f70c0-46a5-4c30-9da0-a5a37e49260c","Type":"ContainerDied","Data":"de8efe94c271514b13540fc0a91efd972ec8c08724895269f190a93b2002dd02"}
Apr 24 21:23:25.720110 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:25.720110 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de8efe94c271514b13540fc0a91efd972ec8c08724895269f190a93b2002dd02"
Apr 24 21:23:25.720110 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:25.720090 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-vzmdl"
Apr 24 21:23:34.975572 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:34.975513 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"]
Apr 24 21:23:34.976074 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:34.976055 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="219f70c0-46a5-4c30-9da0-a5a37e49260c" containerName="s3-init"
Apr 24 21:23:34.976151 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:34.976077 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="219f70c0-46a5-4c30-9da0-a5a37e49260c" containerName="s3-init"
Apr 24 21:23:34.976203 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:34.976176 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="219f70c0-46a5-4c30-9da0-a5a37e49260c" containerName="s3-init"
Apr 24 21:23:34.978662 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:34.978643 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:23:34.981529 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:34.981510 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-3062b-predictor-serving-cert\""
Apr 24 21:23:34.981618 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:34.981577 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-3062b-kube-rbac-proxy-sar-config\""
Apr 24 21:23:34.981618 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:34.981585 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 24 21:23:34.981740 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:34.981680 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4gsmf\""
Apr 24 21:23:34.981740 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:34.981707 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 21:23:34.989605 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:34.989586 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"]
Apr 24 21:23:35.093615 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:35.093580 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thw8h\" (UniqueName: \"kubernetes.io/projected/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-kube-api-access-thw8h\") pod \"isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j\" (UID: \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:23:35.093741 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:35.093625 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j\" (UID: \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:23:35.093741 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:35.093649 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-proxy-tls\") pod \"isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j\" (UID: \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:23:35.093741 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:35.093729 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-raw-sklearn-batcher-3062b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-isvc-raw-sklearn-batcher-3062b-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j\" (UID: \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:23:35.194897 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:35.194874 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thw8h\" (UniqueName: \"kubernetes.io/projected/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-kube-api-access-thw8h\") pod \"isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j\" (UID: \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:23:35.195026 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:35.194930 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j\" (UID: \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:23:35.195026 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:35.194961 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-proxy-tls\") pod \"isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j\" (UID: \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:23:35.195026 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:35.195003 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-raw-sklearn-batcher-3062b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-isvc-raw-sklearn-batcher-3062b-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j\" (UID: \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:23:35.195451 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:35.195429 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j\" (UID: \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:23:35.195730 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:35.195710 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-raw-sklearn-batcher-3062b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-isvc-raw-sklearn-batcher-3062b-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j\" (UID: \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:23:35.197416 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:35.197395 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-proxy-tls\") pod \"isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j\" (UID: \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:23:35.204857 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:35.204837 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thw8h\" (UniqueName: \"kubernetes.io/projected/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-kube-api-access-thw8h\") pod \"isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j\" (UID: \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:23:35.289240 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:35.289218 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:23:35.414783 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:35.414741 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"]
Apr 24 21:23:35.417384 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:23:35.417351 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd42a2b82_77ac_4ab0_9ab3_c2f5ea69b8be.slice/crio-2319e9ff414b625f1a7284a38b2c75a6fde3e23899c9b3cfbd4ca205aa1bfe91 WatchSource:0}: Error finding container 2319e9ff414b625f1a7284a38b2c75a6fde3e23899c9b3cfbd4ca205aa1bfe91: Status 404 returned error can't find the container with id 2319e9ff414b625f1a7284a38b2c75a6fde3e23899c9b3cfbd4ca205aa1bfe91
Apr 24 21:23:35.748344 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:35.748258 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" event={"ID":"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be","Type":"ContainerStarted","Data":"2319e9ff414b625f1a7284a38b2c75a6fde3e23899c9b3cfbd4ca205aa1bfe91"}
Apr 24 21:23:39.761400 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:39.761363 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" event={"ID":"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be","Type":"ContainerStarted","Data":"140915271a4ecd672170fe9c6ffdd1c9045f1d0fe16b155be6140718b9d79005"}
Apr 24 21:23:43.774574 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:43.774542 2569 generic.go:358] "Generic (PLEG): container finished" podID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerID="140915271a4ecd672170fe9c6ffdd1c9045f1d0fe16b155be6140718b9d79005" exitCode=0
Apr 24 21:23:43.774957 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:43.774611 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" event={"ID":"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be","Type":"ContainerDied","Data":"140915271a4ecd672170fe9c6ffdd1c9045f1d0fe16b155be6140718b9d79005"}
Apr 24 21:23:57.822970 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:57.822932 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" event={"ID":"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be","Type":"ContainerStarted","Data":"a3345d69bd9119018af26bdcf33ccef6afb131a8b7acb4851c5aae4f9b772fb6"}
Apr 24 21:23:59.831153 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:23:59.831120 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" event={"ID":"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be","Type":"ContainerStarted","Data":"35437263e90a3a612c73f29abef75a47d190b4d7b923e62855af79949edb45ac"}
Apr 24 21:24:02.844945 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:02.844901 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" event={"ID":"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be","Type":"ContainerStarted","Data":"eee7a73b69e856087e8c339213c53522e09071c8c2481e8e440f6ebfae3582b5"}
Apr 24 21:24:02.845379 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:02.845316 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:24:02.845379 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:02.845347 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:24:02.845689 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:02.845668 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"
Apr 24 21:24:02.846839 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:02.846811 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 24 21:24:02.847547 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:02.847523 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:24:02.867392 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:02.867220 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podStartSLOduration=1.959065772 podStartE2EDuration="28.86720813s" podCreationTimestamp="2026-04-24 21:23:34 +0000 UTC" firstStartedPulling="2026-04-24 21:23:35.419324653 +0000 UTC m=+403.553723266" lastFinishedPulling="2026-04-24 21:24:02.327467007 +0000 UTC m=+430.461865624" observedRunningTime="2026-04-24 21:24:02.866729994 +0000 UTC m=+431.001128616" watchObservedRunningTime="2026-04-24 21:24:02.86720813 +0000 UTC m=+431.001606754"
Apr 24 21:24:03.848978 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:03.848922 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 24 21:24:03.849459 ip-10-0-132-118
kubenswrapper[2569]: I0424 21:24:03.849437 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:24:04.851809 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:04.851750 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 21:24:04.852298 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:04.852270 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:24:04.855396 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:04.855378 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" Apr 24 21:24:05.854483 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:05.854443 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 21:24:05.854877 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:05.854851 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:24:15.854853 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:15.854796 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 21:24:15.855316 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:15.855295 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:24:25.854509 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:25.854459 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 21:24:25.854924 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:25.854896 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:24:35.855300 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:35.855248 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 21:24:35.855824 ip-10-0-132-118 
kubenswrapper[2569]: I0424 21:24:35.855629 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:24:45.855058 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:45.855009 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 21:24:45.855563 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:45.855539 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:24:55.854431 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:55.854388 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 21:24:55.854820 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:24:55.854797 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:25:05.855483 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:05.855454 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" Apr 24 21:25:05.855939 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:05.855513 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" Apr 24 21:25:20.177516 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.177479 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"] Apr 24 21:25:20.178105 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.178054 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kserve-container" containerID="cri-o://a3345d69bd9119018af26bdcf33ccef6afb131a8b7acb4851c5aae4f9b772fb6" gracePeriod=30 Apr 24 21:25:20.178344 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.178298 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kube-rbac-proxy" containerID="cri-o://35437263e90a3a612c73f29abef75a47d190b4d7b923e62855af79949edb45ac" gracePeriod=30 Apr 24 21:25:20.178571 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.178519 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="agent" containerID="cri-o://eee7a73b69e856087e8c339213c53522e09071c8c2481e8e440f6ebfae3582b5" gracePeriod=30 Apr 24 21:25:20.338700 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.338667 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn"] Apr 24 21:25:20.342413 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.342391 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:20.345731 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.345707 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-da412-kube-rbac-proxy-sar-config\"" Apr 24 21:25:20.345866 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.345835 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-da412-predictor-serving-cert\"" Apr 24 21:25:20.356954 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.356931 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn"] Apr 24 21:25:20.419104 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.419084 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl"] Apr 24 21:25:20.422440 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.422425 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:20.424984 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.424968 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-da412-predictor-serving-cert\"" Apr 24 21:25:20.425066 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.424992 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-da412-kube-rbac-proxy-sar-config\"" Apr 24 21:25:20.431721 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.431696 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl"] Apr 24 21:25:20.490587 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.490553 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9vcl\" (UniqueName: \"kubernetes.io/projected/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-kube-api-access-l9vcl\") pod \"isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn\" (UID: \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:20.490736 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.490596 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn\" (UID: \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:20.490736 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.490681 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-proxy-tls\") pod \"isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn\" (UID: \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:20.490852 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.490789 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-da412-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-isvc-sklearn-graph-raw-da412-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn\" (UID: \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:20.591839 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.591810 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn\" (UID: \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:20.591979 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.591850 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a347c45e-1f6f-4252-a817-9390a6dd3ec5-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl\" (UID: \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:20.591979 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.591887 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-proxy-tls\") pod \"isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn\" (UID: \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:20.591979 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.591944 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a347c45e-1f6f-4252-a817-9390a6dd3ec5-proxy-tls\") pod \"isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl\" (UID: \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:20.592143 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.591990 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmsxx\" (UniqueName: \"kubernetes.io/projected/a347c45e-1f6f-4252-a817-9390a6dd3ec5-kube-api-access-nmsxx\") pod \"isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl\" (UID: \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:20.592143 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.592022 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-da412-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-isvc-sklearn-graph-raw-da412-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn\" (UID: \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:20.592143 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.592086 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-da412-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a347c45e-1f6f-4252-a817-9390a6dd3ec5-isvc-xgboost-graph-raw-da412-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl\" (UID: \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:20.592349 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.592153 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9vcl\" (UniqueName: \"kubernetes.io/projected/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-kube-api-access-l9vcl\") pod \"isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn\" (UID: \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:20.592403 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.592348 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn\" (UID: \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:20.592783 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.592738 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-da412-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-isvc-sklearn-graph-raw-da412-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn\" (UID: \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:20.594359 
ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.594338 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-proxy-tls\") pod \"isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn\" (UID: \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:20.601478 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.601453 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9vcl\" (UniqueName: \"kubernetes.io/projected/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-kube-api-access-l9vcl\") pod \"isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn\" (UID: \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:20.653297 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.653269 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:20.693219 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.693141 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a347c45e-1f6f-4252-a817-9390a6dd3ec5-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl\" (UID: \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:20.693329 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.693247 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a347c45e-1f6f-4252-a817-9390a6dd3ec5-proxy-tls\") pod \"isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl\" (UID: \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:20.693329 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.693297 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmsxx\" (UniqueName: \"kubernetes.io/projected/a347c45e-1f6f-4252-a817-9390a6dd3ec5-kube-api-access-nmsxx\") pod \"isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl\" (UID: \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:20.693426 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.693357 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-da412-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a347c45e-1f6f-4252-a817-9390a6dd3ec5-isvc-xgboost-graph-raw-da412-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl\" (UID: 
\"a347c45e-1f6f-4252-a817-9390a6dd3ec5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:20.693477 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.693419 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a347c45e-1f6f-4252-a817-9390a6dd3ec5-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl\" (UID: \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:20.694093 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.694067 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-da412-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a347c45e-1f6f-4252-a817-9390a6dd3ec5-isvc-xgboost-graph-raw-da412-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl\" (UID: \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:20.695514 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.695493 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a347c45e-1f6f-4252-a817-9390a6dd3ec5-proxy-tls\") pod \"isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl\" (UID: \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:20.707504 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.707474 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmsxx\" (UniqueName: \"kubernetes.io/projected/a347c45e-1f6f-4252-a817-9390a6dd3ec5-kube-api-access-nmsxx\") pod \"isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl\" (UID: \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:20.733007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.732967 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:20.779292 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.779264 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn"] Apr 24 21:25:20.781831 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:25:20.781794 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd35a9f7d_8a52_4cc0_a507_a704d6a2a614.slice/crio-4b45ede71b92970f111c876798f93ba51a8580e64e30838bb3c0673969c408e7 WatchSource:0}: Error finding container 4b45ede71b92970f111c876798f93ba51a8580e64e30838bb3c0673969c408e7: Status 404 returned error can't find the container with id 4b45ede71b92970f111c876798f93ba51a8580e64e30838bb3c0673969c408e7 Apr 24 21:25:20.870043 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:20.870011 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl"] Apr 24 21:25:20.873431 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:25:20.873385 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda347c45e_1f6f_4252_a817_9390a6dd3ec5.slice/crio-5ffc36c0cfcbea10bd684491d954ec2e85d10e29164b02f4283d798aa40e62f5 WatchSource:0}: Error finding container 5ffc36c0cfcbea10bd684491d954ec2e85d10e29164b02f4283d798aa40e62f5: Status 404 returned error can't find the container with id 5ffc36c0cfcbea10bd684491d954ec2e85d10e29164b02f4283d798aa40e62f5 Apr 24 21:25:21.072711 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:21.072677 2569 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" event={"ID":"a347c45e-1f6f-4252-a817-9390a6dd3ec5","Type":"ContainerStarted","Data":"2198c9eb723cc400708ec5fba3ef0ea5130f9d76cf009bdc59a75f7c320705b0"} Apr 24 21:25:21.072925 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:21.072717 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" event={"ID":"a347c45e-1f6f-4252-a817-9390a6dd3ec5","Type":"ContainerStarted","Data":"5ffc36c0cfcbea10bd684491d954ec2e85d10e29164b02f4283d798aa40e62f5"} Apr 24 21:25:21.074176 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:21.074149 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" event={"ID":"d35a9f7d-8a52-4cc0-a507-a704d6a2a614","Type":"ContainerStarted","Data":"c5b5bab40dbfbe503019af752df7f9144147a6769e71b6ccbdb8bcab6e3ad439"} Apr 24 21:25:21.074279 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:21.074180 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" event={"ID":"d35a9f7d-8a52-4cc0-a507-a704d6a2a614","Type":"ContainerStarted","Data":"4b45ede71b92970f111c876798f93ba51a8580e64e30838bb3c0673969c408e7"} Apr 24 21:25:21.076395 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:21.076372 2569 generic.go:358] "Generic (PLEG): container finished" podID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerID="35437263e90a3a612c73f29abef75a47d190b4d7b923e62855af79949edb45ac" exitCode=2 Apr 24 21:25:21.076486 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:21.076425 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" event={"ID":"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be","Type":"ContainerDied","Data":"35437263e90a3a612c73f29abef75a47d190b4d7b923e62855af79949edb45ac"} Apr 
24 21:25:24.852777 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:24.852726 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused" Apr 24 21:25:25.090512 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:25.090479 2569 generic.go:358] "Generic (PLEG): container finished" podID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerID="2198c9eb723cc400708ec5fba3ef0ea5130f9d76cf009bdc59a75f7c320705b0" exitCode=0 Apr 24 21:25:25.090686 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:25.090560 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" event={"ID":"a347c45e-1f6f-4252-a817-9390a6dd3ec5","Type":"ContainerDied","Data":"2198c9eb723cc400708ec5fba3ef0ea5130f9d76cf009bdc59a75f7c320705b0"} Apr 24 21:25:25.091960 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:25.091939 2569 generic.go:358] "Generic (PLEG): container finished" podID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerID="c5b5bab40dbfbe503019af752df7f9144147a6769e71b6ccbdb8bcab6e3ad439" exitCode=0 Apr 24 21:25:25.092063 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:25.092008 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" event={"ID":"d35a9f7d-8a52-4cc0-a507-a704d6a2a614","Type":"ContainerDied","Data":"c5b5bab40dbfbe503019af752df7f9144147a6769e71b6ccbdb8bcab6e3ad439"} Apr 24 21:25:25.094846 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:25.094825 2569 generic.go:358] "Generic (PLEG): container finished" podID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerID="a3345d69bd9119018af26bdcf33ccef6afb131a8b7acb4851c5aae4f9b772fb6" exitCode=0 Apr 24 
21:25:25.094928 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:25.094912 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" event={"ID":"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be","Type":"ContainerDied","Data":"a3345d69bd9119018af26bdcf33ccef6afb131a8b7acb4851c5aae4f9b772fb6"} Apr 24 21:25:25.854979 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:25.854935 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 21:25:25.855620 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:25.855433 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:25:26.103108 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:26.102851 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" event={"ID":"d35a9f7d-8a52-4cc0-a507-a704d6a2a614","Type":"ContainerStarted","Data":"e10da51d8e9bda31b6949ea5907ca6b10f1add44f2efb9acf8d91c7c5cfc1118"} Apr 24 21:25:26.103108 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:26.102929 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" event={"ID":"d35a9f7d-8a52-4cc0-a507-a704d6a2a614","Type":"ContainerStarted","Data":"bcf8bbada951118bd8a84283dec246b22772dd30bcfd32bd4d2572b779ee6d3a"} Apr 24 21:25:26.103673 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:26.103648 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:26.103673 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:26.103678 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:26.105132 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:26.105058 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:25:26.125102 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:26.124992 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" podStartSLOduration=6.12497254 podStartE2EDuration="6.12497254s" podCreationTimestamp="2026-04-24 21:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:25:26.12390634 +0000 UTC m=+514.258304978" watchObservedRunningTime="2026-04-24 21:25:26.12497254 +0000 UTC m=+514.259371162" Apr 24 21:25:27.107502 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:27.107070 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:25:29.852189 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:29.852151 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" 
containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused" Apr 24 21:25:32.111700 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:32.111673 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:25:32.112223 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:32.112196 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:25:34.852538 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:34.852492 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused" Apr 24 21:25:34.852978 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:34.852642 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" Apr 24 21:25:35.854555 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:35.854511 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 21:25:35.854967 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:35.854894 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:25:39.852250 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:39.852200 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused" Apr 24 21:25:42.112595 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:42.112499 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:25:44.165541 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:44.165461 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" event={"ID":"a347c45e-1f6f-4252-a817-9390a6dd3ec5","Type":"ContainerStarted","Data":"2962436ecead1cf1306c65a787423f9e521c89f2f8af5df67d648af0f9e62ada"} Apr 24 21:25:44.165541 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:44.165501 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" event={"ID":"a347c45e-1f6f-4252-a817-9390a6dd3ec5","Type":"ContainerStarted","Data":"90ec6e1ca3ac6acc4dd3c2a32f8b9be9fd9e457c21117aed53daef493ccc7d59"} Apr 24 21:25:44.165949 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:44.165849 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:44.186989 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:44.186945 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" podStartSLOduration=5.441760816 podStartE2EDuration="24.186930288s" podCreationTimestamp="2026-04-24 21:25:20 +0000 UTC" firstStartedPulling="2026-04-24 21:25:25.092005848 +0000 UTC m=+513.226404446" lastFinishedPulling="2026-04-24 21:25:43.837175319 +0000 UTC m=+531.971573918" observedRunningTime="2026-04-24 21:25:44.184403343 +0000 UTC m=+532.318801968" watchObservedRunningTime="2026-04-24 21:25:44.186930288 +0000 UTC m=+532.321328910" Apr 24 21:25:44.852370 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:44.852322 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused" Apr 24 21:25:45.168800 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:45.168706 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:45.169862 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:45.169836 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 21:25:45.855298 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:45.855253 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 21:25:45.855479 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:45.855403 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" Apr 24 21:25:45.855629 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:45.855592 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:25:45.855741 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:45.855702 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" Apr 24 21:25:46.171444 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:46.171354 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 21:25:49.851982 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:49.851938 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused" Apr 24 21:25:50.839679 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:50.839657 2569 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" Apr 24 21:25:50.974743 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:50.974664 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thw8h\" (UniqueName: \"kubernetes.io/projected/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-kube-api-access-thw8h\") pod \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\" (UID: \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\") " Apr 24 21:25:50.974743 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:50.974701 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-kserve-provision-location\") pod \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\" (UID: \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\") " Apr 24 21:25:50.975162 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:50.974801 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-raw-sklearn-batcher-3062b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-isvc-raw-sklearn-batcher-3062b-kube-rbac-proxy-sar-config\") pod \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\" (UID: \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\") " Apr 24 21:25:50.975162 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:50.974822 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-proxy-tls\") pod \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\" (UID: \"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be\") " Apr 24 21:25:50.975162 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:50.975117 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" (UID: "d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:25:50.975162 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:50.975134 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-isvc-raw-sklearn-batcher-3062b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-raw-sklearn-batcher-3062b-kube-rbac-proxy-sar-config") pod "d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" (UID: "d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be"). InnerVolumeSpecName "isvc-raw-sklearn-batcher-3062b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:25:50.976914 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:50.976889 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" (UID: "d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:25:50.977042 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:50.977024 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-kube-api-access-thw8h" (OuterVolumeSpecName: "kube-api-access-thw8h") pod "d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" (UID: "d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be"). InnerVolumeSpecName "kube-api-access-thw8h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:25:51.075552 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.075523 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-raw-sklearn-batcher-3062b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-isvc-raw-sklearn-batcher-3062b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:25:51.075552 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.075546 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-proxy-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:25:51.075552 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.075556 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-thw8h\" (UniqueName: \"kubernetes.io/projected/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-kube-api-access-thw8h\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:25:51.075739 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.075566 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be-kserve-provision-location\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:25:51.176071 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.176043 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 21:25:51.176517 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.176491 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.24:8080: connect: connection refused" Apr 24 21:25:51.187917 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.187893 2569 generic.go:358] "Generic (PLEG): container finished" podID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerID="eee7a73b69e856087e8c339213c53522e09071c8c2481e8e440f6ebfae3582b5" exitCode=0 Apr 24 21:25:51.188021 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.187957 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" event={"ID":"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be","Type":"ContainerDied","Data":"eee7a73b69e856087e8c339213c53522e09071c8c2481e8e440f6ebfae3582b5"} Apr 24 21:25:51.188021 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.187980 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" Apr 24 21:25:51.188021 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.187994 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j" event={"ID":"d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be","Type":"ContainerDied","Data":"2319e9ff414b625f1a7284a38b2c75a6fde3e23899c9b3cfbd4ca205aa1bfe91"} Apr 24 21:25:51.188021 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.188009 2569 scope.go:117] "RemoveContainer" containerID="eee7a73b69e856087e8c339213c53522e09071c8c2481e8e440f6ebfae3582b5" Apr 24 21:25:51.203435 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.203414 2569 scope.go:117] "RemoveContainer" containerID="35437263e90a3a612c73f29abef75a47d190b4d7b923e62855af79949edb45ac" Apr 24 21:25:51.210696 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.210678 2569 scope.go:117] "RemoveContainer" containerID="a3345d69bd9119018af26bdcf33ccef6afb131a8b7acb4851c5aae4f9b772fb6" Apr 24 21:25:51.215448 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.215429 2569 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"] Apr 24 21:25:51.217977 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.217964 2569 scope.go:117] "RemoveContainer" containerID="140915271a4ecd672170fe9c6ffdd1c9045f1d0fe16b155be6140718b9d79005" Apr 24 21:25:51.220708 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.220687 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-3062b-predictor-5b6b6b84f6-rv67j"] Apr 24 21:25:51.225347 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.225327 2569 scope.go:117] "RemoveContainer" containerID="eee7a73b69e856087e8c339213c53522e09071c8c2481e8e440f6ebfae3582b5" Apr 24 21:25:51.225623 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:25:51.225602 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eee7a73b69e856087e8c339213c53522e09071c8c2481e8e440f6ebfae3582b5\": container with ID starting with eee7a73b69e856087e8c339213c53522e09071c8c2481e8e440f6ebfae3582b5 not found: ID does not exist" containerID="eee7a73b69e856087e8c339213c53522e09071c8c2481e8e440f6ebfae3582b5" Apr 24 21:25:51.225768 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.225631 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eee7a73b69e856087e8c339213c53522e09071c8c2481e8e440f6ebfae3582b5"} err="failed to get container status \"eee7a73b69e856087e8c339213c53522e09071c8c2481e8e440f6ebfae3582b5\": rpc error: code = NotFound desc = could not find container \"eee7a73b69e856087e8c339213c53522e09071c8c2481e8e440f6ebfae3582b5\": container with ID starting with eee7a73b69e856087e8c339213c53522e09071c8c2481e8e440f6ebfae3582b5 not found: ID does not exist" Apr 24 21:25:51.225768 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.225651 2569 scope.go:117] "RemoveContainer" 
containerID="35437263e90a3a612c73f29abef75a47d190b4d7b923e62855af79949edb45ac" Apr 24 21:25:51.225923 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:25:51.225902 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35437263e90a3a612c73f29abef75a47d190b4d7b923e62855af79949edb45ac\": container with ID starting with 35437263e90a3a612c73f29abef75a47d190b4d7b923e62855af79949edb45ac not found: ID does not exist" containerID="35437263e90a3a612c73f29abef75a47d190b4d7b923e62855af79949edb45ac" Apr 24 21:25:51.225964 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.225930 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35437263e90a3a612c73f29abef75a47d190b4d7b923e62855af79949edb45ac"} err="failed to get container status \"35437263e90a3a612c73f29abef75a47d190b4d7b923e62855af79949edb45ac\": rpc error: code = NotFound desc = could not find container \"35437263e90a3a612c73f29abef75a47d190b4d7b923e62855af79949edb45ac\": container with ID starting with 35437263e90a3a612c73f29abef75a47d190b4d7b923e62855af79949edb45ac not found: ID does not exist" Apr 24 21:25:51.225964 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.225951 2569 scope.go:117] "RemoveContainer" containerID="a3345d69bd9119018af26bdcf33ccef6afb131a8b7acb4851c5aae4f9b772fb6" Apr 24 21:25:51.226190 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:25:51.226170 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3345d69bd9119018af26bdcf33ccef6afb131a8b7acb4851c5aae4f9b772fb6\": container with ID starting with a3345d69bd9119018af26bdcf33ccef6afb131a8b7acb4851c5aae4f9b772fb6 not found: ID does not exist" containerID="a3345d69bd9119018af26bdcf33ccef6afb131a8b7acb4851c5aae4f9b772fb6" Apr 24 21:25:51.226264 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.226197 2569 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"a3345d69bd9119018af26bdcf33ccef6afb131a8b7acb4851c5aae4f9b772fb6"} err="failed to get container status \"a3345d69bd9119018af26bdcf33ccef6afb131a8b7acb4851c5aae4f9b772fb6\": rpc error: code = NotFound desc = could not find container \"a3345d69bd9119018af26bdcf33ccef6afb131a8b7acb4851c5aae4f9b772fb6\": container with ID starting with a3345d69bd9119018af26bdcf33ccef6afb131a8b7acb4851c5aae4f9b772fb6 not found: ID does not exist" Apr 24 21:25:51.226264 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.226218 2569 scope.go:117] "RemoveContainer" containerID="140915271a4ecd672170fe9c6ffdd1c9045f1d0fe16b155be6140718b9d79005" Apr 24 21:25:51.226469 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:25:51.226452 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"140915271a4ecd672170fe9c6ffdd1c9045f1d0fe16b155be6140718b9d79005\": container with ID starting with 140915271a4ecd672170fe9c6ffdd1c9045f1d0fe16b155be6140718b9d79005 not found: ID does not exist" containerID="140915271a4ecd672170fe9c6ffdd1c9045f1d0fe16b155be6140718b9d79005" Apr 24 21:25:51.226510 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:51.226474 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"140915271a4ecd672170fe9c6ffdd1c9045f1d0fe16b155be6140718b9d79005"} err="failed to get container status \"140915271a4ecd672170fe9c6ffdd1c9045f1d0fe16b155be6140718b9d79005\": rpc error: code = NotFound desc = could not find container \"140915271a4ecd672170fe9c6ffdd1c9045f1d0fe16b155be6140718b9d79005\": container with ID starting with 140915271a4ecd672170fe9c6ffdd1c9045f1d0fe16b155be6140718b9d79005 not found: ID does not exist" Apr 24 21:25:52.112309 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:52.112271 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" 
podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:25:52.436594 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:25:52.436558 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" path="/var/lib/kubelet/pods/d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be/volumes" Apr 24 21:26:01.177435 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:26:01.177395 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 21:26:02.112998 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:26:02.112950 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:26:11.177390 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:26:11.177345 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 21:26:12.112715 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:26:12.112680 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:26:21.176476 
ip-10-0-132-118 kubenswrapper[2569]: I0424 21:26:21.176434 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 21:26:22.112401 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:26:22.112363 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:26:31.177422 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:26:31.177381 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 21:26:32.112728 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:26:32.112699 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:26:41.176630 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:26:41.176582 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 21:26:51.177728 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:26:51.177700 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" Apr 24 
21:27:10.421581 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.421542 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn"]
Apr 24 21:27:10.422050 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.421952 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kserve-container" containerID="cri-o://bcf8bbada951118bd8a84283dec246b22772dd30bcfd32bd4d2572b779ee6d3a" gracePeriod=30
Apr 24 21:27:10.422121 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.422028 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kube-rbac-proxy" containerID="cri-o://e10da51d8e9bda31b6949ea5907ca6b10f1add44f2efb9acf8d91c7c5cfc1118" gracePeriod=30
Apr 24 21:27:10.554479 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.554450 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl"]
Apr 24 21:27:10.555056 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.554997 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kserve-container" containerID="cri-o://90ec6e1ca3ac6acc4dd3c2a32f8b9be9fd9e457c21117aed53daef493ccc7d59" gracePeriod=30
Apr 24 21:27:10.555179 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.555162 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kube-rbac-proxy" containerID="cri-o://2962436ecead1cf1306c65a787423f9e521c89f2f8af5df67d648af0f9e62ada" gracePeriod=30
Apr 24 21:27:10.557073 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.557049 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"]
Apr 24 21:27:10.557435 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.557420 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kube-rbac-proxy"
Apr 24 21:27:10.557523 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.557438 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kube-rbac-proxy"
Apr 24 21:27:10.557523 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.557462 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kserve-container"
Apr 24 21:27:10.557523 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.557471 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kserve-container"
Apr 24 21:27:10.557523 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.557482 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="agent"
Apr 24 21:27:10.557523 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.557491 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="agent"
Apr 24 21:27:10.557523 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.557510 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="storage-initializer"
Apr 24 21:27:10.557523 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.557519 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="storage-initializer"
Apr 24 21:27:10.557894 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.557603 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kserve-container"
Apr 24 21:27:10.557894 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.557620 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="agent"
Apr 24 21:27:10.557894 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.557630 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d42a2b82-77ac-4ab0-9ab3-c2f5ea69b8be" containerName="kube-rbac-proxy"
Apr 24 21:27:10.568015 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.567997 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"
Apr 24 21:27:10.570666 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.570649 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-07ea4-predictor-serving-cert\""
Apr 24 21:27:10.570771 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.570654 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\""
Apr 24 21:27:10.577652 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.577633 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"]
Apr 24 21:27:10.605095 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.605072 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/679512e3-c45f-4fa4-8854-212d65ef1285-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb\" (UID: \"679512e3-c45f-4fa4-8854-212d65ef1285\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"
Apr 24 21:27:10.605203 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.605146 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/679512e3-c45f-4fa4-8854-212d65ef1285-isvc-sklearn-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb\" (UID: \"679512e3-c45f-4fa4-8854-212d65ef1285\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"
Apr 24 21:27:10.605203 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.605170 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzl2t\" (UniqueName: \"kubernetes.io/projected/679512e3-c45f-4fa4-8854-212d65ef1285-kube-api-access-kzl2t\") pod \"isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb\" (UID: \"679512e3-c45f-4fa4-8854-212d65ef1285\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"
Apr 24 21:27:10.605309 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.605256 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/679512e3-c45f-4fa4-8854-212d65ef1285-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb\" (UID: \"679512e3-c45f-4fa4-8854-212d65ef1285\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"
Apr 24 21:27:10.657945 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.657916 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"]
Apr 24 21:27:10.661633 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.661612 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"
Apr 24 21:27:10.664452 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.664431 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-07ea4-predictor-serving-cert\""
Apr 24 21:27:10.664551 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.664439 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\""
Apr 24 21:27:10.669648 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.669630 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"]
Apr 24 21:27:10.706268 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.706206 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"
Apr 24 21:27:10.706268 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.706246 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mc4p\" (UniqueName: \"kubernetes.io/projected/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-kube-api-access-5mc4p\") pod \"isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"
Apr 24 21:27:10.706441 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.706270 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/679512e3-c45f-4fa4-8854-212d65ef1285-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb\" (UID: \"679512e3-c45f-4fa4-8854-212d65ef1285\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"
Apr 24 21:27:10.706441 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.706378 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/679512e3-c45f-4fa4-8854-212d65ef1285-isvc-sklearn-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb\" (UID: \"679512e3-c45f-4fa4-8854-212d65ef1285\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"
Apr 24 21:27:10.706441 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.706419 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzl2t\" (UniqueName: \"kubernetes.io/projected/679512e3-c45f-4fa4-8854-212d65ef1285-kube-api-access-kzl2t\") pod \"isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb\" (UID: \"679512e3-c45f-4fa4-8854-212d65ef1285\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"
Apr 24 21:27:10.706614 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.706488 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-isvc-xgboost-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"
Apr 24 21:27:10.706614 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.706535 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"
Apr 24 21:27:10.706614 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.706586 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/679512e3-c45f-4fa4-8854-212d65ef1285-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb\" (UID: \"679512e3-c45f-4fa4-8854-212d65ef1285\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"
Apr 24 21:27:10.707006 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.706987 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/679512e3-c45f-4fa4-8854-212d65ef1285-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb\" (UID: \"679512e3-c45f-4fa4-8854-212d65ef1285\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"
Apr 24 21:27:10.707142 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.707121 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/679512e3-c45f-4fa4-8854-212d65ef1285-isvc-sklearn-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb\" (UID: \"679512e3-c45f-4fa4-8854-212d65ef1285\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"
Apr 24 21:27:10.708740 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.708724 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/679512e3-c45f-4fa4-8854-212d65ef1285-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb\" (UID: \"679512e3-c45f-4fa4-8854-212d65ef1285\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"
Apr 24 21:27:10.714572 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.714554 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzl2t\" (UniqueName: \"kubernetes.io/projected/679512e3-c45f-4fa4-8854-212d65ef1285-kube-api-access-kzl2t\") pod \"isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb\" (UID: \"679512e3-c45f-4fa4-8854-212d65ef1285\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"
Apr 24 21:27:10.807260 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.807231 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"
Apr 24 21:27:10.807396 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.807270 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mc4p\" (UniqueName: \"kubernetes.io/projected/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-kube-api-access-5mc4p\") pod \"isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"
Apr 24 21:27:10.807396 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.807326 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-isvc-xgboost-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"
Apr 24 21:27:10.807396 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.807350 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"
Apr 24 21:27:10.807396 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:27:10.807370 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-serving-cert: secret "isvc-xgboost-graph-raw-hpa-07ea4-predictor-serving-cert" not found
Apr 24 21:27:10.807595 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:27:10.807455 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-proxy-tls podName:7f8f1183-7c95-4acf-9967-2c2db7cbe81d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:11.307437511 +0000 UTC m=+619.441836110 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-proxy-tls") pod "isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" (UID: "7f8f1183-7c95-4acf-9967-2c2db7cbe81d") : secret "isvc-xgboost-graph-raw-hpa-07ea4-predictor-serving-cert" not found
Apr 24 21:27:10.807679 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.807661 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"
Apr 24 21:27:10.808008 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.807988 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-isvc-xgboost-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"
Apr 24 21:27:10.815474 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.815451 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mc4p\" (UniqueName: \"kubernetes.io/projected/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-kube-api-access-5mc4p\") pod \"isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"
Apr 24 21:27:10.878117 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.878089 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"
Apr 24 21:27:10.992476 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:10.992452 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"]
Apr 24 21:27:10.994607 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:27:10.994567 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod679512e3_c45f_4fa4_8854_212d65ef1285.slice/crio-42a4b1be7f26ca87df64009bbe1d9023dfa9d0b9116473425e2ded56c8a94773 WatchSource:0}: Error finding container 42a4b1be7f26ca87df64009bbe1d9023dfa9d0b9116473425e2ded56c8a94773: Status 404 returned error can't find the container with id 42a4b1be7f26ca87df64009bbe1d9023dfa9d0b9116473425e2ded56c8a94773
Apr 24 21:27:11.171838 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:11.171749 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused"
Apr 24 21:27:11.177409 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:11.177378 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 21:27:11.311894 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:11.311858 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"
Apr 24 21:27:11.314274 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:11.314241 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"
Apr 24 21:27:11.434362 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:11.434333 2569 generic.go:358] "Generic (PLEG): container finished" podID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerID="2962436ecead1cf1306c65a787423f9e521c89f2f8af5df67d648af0f9e62ada" exitCode=2
Apr 24 21:27:11.434802 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:11.434397 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" event={"ID":"a347c45e-1f6f-4252-a817-9390a6dd3ec5","Type":"ContainerDied","Data":"2962436ecead1cf1306c65a787423f9e521c89f2f8af5df67d648af0f9e62ada"}
Apr 24 21:27:11.436584 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:11.436517 2569 generic.go:358] "Generic (PLEG): container finished" podID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerID="e10da51d8e9bda31b6949ea5907ca6b10f1add44f2efb9acf8d91c7c5cfc1118" exitCode=2
Apr 24 21:27:11.436584 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:11.436544 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" event={"ID":"d35a9f7d-8a52-4cc0-a507-a704d6a2a614","Type":"ContainerDied","Data":"e10da51d8e9bda31b6949ea5907ca6b10f1add44f2efb9acf8d91c7c5cfc1118"}
Apr 24 21:27:11.438313 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:11.438291 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" event={"ID":"679512e3-c45f-4fa4-8854-212d65ef1285","Type":"ContainerStarted","Data":"cfe2aaf45801b4b5ddbed3735f80a5951c7e61ac264b3bea633c0d94680972d2"}
Apr 24 21:27:11.438413 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:11.438320 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" event={"ID":"679512e3-c45f-4fa4-8854-212d65ef1285","Type":"ContainerStarted","Data":"42a4b1be7f26ca87df64009bbe1d9023dfa9d0b9116473425e2ded56c8a94773"}
Apr 24 21:27:11.572969 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:11.572888 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"
Apr 24 21:27:11.692524 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:11.692500 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"]
Apr 24 21:27:11.695034 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:27:11.695007 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f8f1183_7c95_4acf_9967_2c2db7cbe81d.slice/crio-69bf0d2c628dcc17177252f0444d1f6bc46dab1192f0c60a1b2b77d5a6c89e9e WatchSource:0}: Error finding container 69bf0d2c628dcc17177252f0444d1f6bc46dab1192f0c60a1b2b77d5a6c89e9e: Status 404 returned error can't find the container with id 69bf0d2c628dcc17177252f0444d1f6bc46dab1192f0c60a1b2b77d5a6c89e9e
Apr 24 21:27:12.108126 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:12.108082 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused"
Apr 24 21:27:12.112456 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:12.112422 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 24 21:27:12.445624 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:12.445536 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" event={"ID":"7f8f1183-7c95-4acf-9967-2c2db7cbe81d","Type":"ContainerStarted","Data":"c7afbfbefb3b03399418e6f82b1f20315cbc8840afe39f97b3a07a1212c342b2"}
Apr 24 21:27:12.445624 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:12.445576 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" event={"ID":"7f8f1183-7c95-4acf-9967-2c2db7cbe81d","Type":"ContainerStarted","Data":"69bf0d2c628dcc17177252f0444d1f6bc46dab1192f0c60a1b2b77d5a6c89e9e"}
Apr 24 21:27:13.891740 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:13.891720 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl"
Apr 24 21:27:13.932854 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:13.932832 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-da412-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a347c45e-1f6f-4252-a817-9390a6dd3ec5-isvc-xgboost-graph-raw-da412-kube-rbac-proxy-sar-config\") pod \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\" (UID: \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\") "
Apr 24 21:27:13.932949 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:13.932887 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a347c45e-1f6f-4252-a817-9390a6dd3ec5-kserve-provision-location\") pod \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\" (UID: \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\") "
Apr 24 21:27:13.932995 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:13.932950 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a347c45e-1f6f-4252-a817-9390a6dd3ec5-proxy-tls\") pod \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\" (UID: \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\") "
Apr 24 21:27:13.932995 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:13.932989 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmsxx\" (UniqueName: \"kubernetes.io/projected/a347c45e-1f6f-4252-a817-9390a6dd3ec5-kube-api-access-nmsxx\") pod \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\" (UID: \"a347c45e-1f6f-4252-a817-9390a6dd3ec5\") "
Apr 24 21:27:13.933218 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:13.933191 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a347c45e-1f6f-4252-a817-9390a6dd3ec5-isvc-xgboost-graph-raw-da412-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-da412-kube-rbac-proxy-sar-config") pod "a347c45e-1f6f-4252-a817-9390a6dd3ec5" (UID: "a347c45e-1f6f-4252-a817-9390a6dd3ec5"). InnerVolumeSpecName "isvc-xgboost-graph-raw-da412-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:27:13.933271 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:13.933215 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a347c45e-1f6f-4252-a817-9390a6dd3ec5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a347c45e-1f6f-4252-a817-9390a6dd3ec5" (UID: "a347c45e-1f6f-4252-a817-9390a6dd3ec5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:27:13.934994 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:13.934965 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a347c45e-1f6f-4252-a817-9390a6dd3ec5-kube-api-access-nmsxx" (OuterVolumeSpecName: "kube-api-access-nmsxx") pod "a347c45e-1f6f-4252-a817-9390a6dd3ec5" (UID: "a347c45e-1f6f-4252-a817-9390a6dd3ec5"). InnerVolumeSpecName "kube-api-access-nmsxx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:27:13.934994 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:13.934967 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a347c45e-1f6f-4252-a817-9390a6dd3ec5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a347c45e-1f6f-4252-a817-9390a6dd3ec5" (UID: "a347c45e-1f6f-4252-a817-9390a6dd3ec5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:27:14.034408 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.034385 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a347c45e-1f6f-4252-a817-9390a6dd3ec5-kserve-provision-location\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:27:14.034511 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.034410 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a347c45e-1f6f-4252-a817-9390a6dd3ec5-proxy-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:27:14.034511 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.034420 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmsxx\" (UniqueName: \"kubernetes.io/projected/a347c45e-1f6f-4252-a817-9390a6dd3ec5-kube-api-access-nmsxx\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:27:14.034511 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.034430 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-da412-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a347c45e-1f6f-4252-a817-9390a6dd3ec5-isvc-xgboost-graph-raw-da412-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:27:14.451996 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.451969 2569 generic.go:358] "Generic (PLEG): container finished" podID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerID="90ec6e1ca3ac6acc4dd3c2a32f8b9be9fd9e457c21117aed53daef493ccc7d59" exitCode=0
Apr 24 21:27:14.452123 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.452038 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl"
Apr 24 21:27:14.452123 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.452050 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" event={"ID":"a347c45e-1f6f-4252-a817-9390a6dd3ec5","Type":"ContainerDied","Data":"90ec6e1ca3ac6acc4dd3c2a32f8b9be9fd9e457c21117aed53daef493ccc7d59"}
Apr 24 21:27:14.452123 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.452109 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl" event={"ID":"a347c45e-1f6f-4252-a817-9390a6dd3ec5","Type":"ContainerDied","Data":"5ffc36c0cfcbea10bd684491d954ec2e85d10e29164b02f4283d798aa40e62f5"}
Apr 24 21:27:14.452278 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.452129 2569 scope.go:117] "RemoveContainer" containerID="2962436ecead1cf1306c65a787423f9e521c89f2f8af5df67d648af0f9e62ada"
Apr 24 21:27:14.459892 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.459872 2569 scope.go:117] "RemoveContainer" containerID="90ec6e1ca3ac6acc4dd3c2a32f8b9be9fd9e457c21117aed53daef493ccc7d59"
Apr 24 21:27:14.466922 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.466903 2569 scope.go:117] "RemoveContainer" containerID="2198c9eb723cc400708ec5fba3ef0ea5130f9d76cf009bdc59a75f7c320705b0"
Apr 24 21:27:14.469024 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.469000 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl"]
Apr 24 21:27:14.471691 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.471670 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-da412-predictor-9b7c6cff7-xdnhl"]
Apr 24 21:27:14.474461 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.474447 2569 scope.go:117] "RemoveContainer" containerID="2962436ecead1cf1306c65a787423f9e521c89f2f8af5df67d648af0f9e62ada"
Apr 24 21:27:14.474702 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:27:14.474684 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2962436ecead1cf1306c65a787423f9e521c89f2f8af5df67d648af0f9e62ada\": container with ID starting with 2962436ecead1cf1306c65a787423f9e521c89f2f8af5df67d648af0f9e62ada not found: ID does not exist" containerID="2962436ecead1cf1306c65a787423f9e521c89f2f8af5df67d648af0f9e62ada"
Apr 24 21:27:14.474805 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.474710 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2962436ecead1cf1306c65a787423f9e521c89f2f8af5df67d648af0f9e62ada"} err="failed to get container status \"2962436ecead1cf1306c65a787423f9e521c89f2f8af5df67d648af0f9e62ada\": rpc error: code = NotFound desc = could not find container \"2962436ecead1cf1306c65a787423f9e521c89f2f8af5df67d648af0f9e62ada\": container with ID starting with 2962436ecead1cf1306c65a787423f9e521c89f2f8af5df67d648af0f9e62ada not found: ID does not exist"
Apr 24 21:27:14.474805 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.474726 2569 scope.go:117] "RemoveContainer" containerID="90ec6e1ca3ac6acc4dd3c2a32f8b9be9fd9e457c21117aed53daef493ccc7d59"
Apr 24 21:27:14.475041 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:27:14.475025 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90ec6e1ca3ac6acc4dd3c2a32f8b9be9fd9e457c21117aed53daef493ccc7d59\": container with ID starting with 90ec6e1ca3ac6acc4dd3c2a32f8b9be9fd9e457c21117aed53daef493ccc7d59 not found: ID does not exist" containerID="90ec6e1ca3ac6acc4dd3c2a32f8b9be9fd9e457c21117aed53daef493ccc7d59"
Apr 24 21:27:14.475080 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.475048 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ec6e1ca3ac6acc4dd3c2a32f8b9be9fd9e457c21117aed53daef493ccc7d59"} err="failed to get container status \"90ec6e1ca3ac6acc4dd3c2a32f8b9be9fd9e457c21117aed53daef493ccc7d59\": rpc error: code = NotFound desc = could not find container \"90ec6e1ca3ac6acc4dd3c2a32f8b9be9fd9e457c21117aed53daef493ccc7d59\": container with ID starting with 90ec6e1ca3ac6acc4dd3c2a32f8b9be9fd9e457c21117aed53daef493ccc7d59 not found: ID does not exist"
Apr 24 21:27:14.475080 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.475063 2569 scope.go:117] "RemoveContainer" containerID="2198c9eb723cc400708ec5fba3ef0ea5130f9d76cf009bdc59a75f7c320705b0"
Apr 24 21:27:14.475301 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:27:14.475284 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2198c9eb723cc400708ec5fba3ef0ea5130f9d76cf009bdc59a75f7c320705b0\": container with ID starting with 2198c9eb723cc400708ec5fba3ef0ea5130f9d76cf009bdc59a75f7c320705b0 not found: ID does not exist" containerID="2198c9eb723cc400708ec5fba3ef0ea5130f9d76cf009bdc59a75f7c320705b0"
Apr 24 21:27:14.475344 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.475306 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2198c9eb723cc400708ec5fba3ef0ea5130f9d76cf009bdc59a75f7c320705b0"} err="failed to get container status \"2198c9eb723cc400708ec5fba3ef0ea5130f9d76cf009bdc59a75f7c320705b0\": rpc error: code = NotFound desc = could not find container \"2198c9eb723cc400708ec5fba3ef0ea5130f9d76cf009bdc59a75f7c320705b0\": container with ID starting with 2198c9eb723cc400708ec5fba3ef0ea5130f9d76cf009bdc59a75f7c320705b0 not found: ID does not exist"
Apr 24 21:27:14.748428 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.748406 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn"
Apr 24 21:27:14.840371 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.840343 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-da412-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-isvc-sklearn-graph-raw-da412-kube-rbac-proxy-sar-config\") pod \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\" (UID: \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\") "
Apr 24 21:27:14.840516 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.840379 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-kserve-provision-location\") pod \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\" (UID: \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\") "
Apr 24 21:27:14.840516 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.840408 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9vcl\" (UniqueName: \"kubernetes.io/projected/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-kube-api-access-l9vcl\") pod \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\" (UID: \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\") "
Apr 24 21:27:14.840516 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.840495 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-proxy-tls\") pod \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\" (UID: \"d35a9f7d-8a52-4cc0-a507-a704d6a2a614\") "
Apr 24 21:27:14.840695 ip-10-0-132-118 kubenswrapper[2569]:
I0424 21:27:14.840670 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d35a9f7d-8a52-4cc0-a507-a704d6a2a614" (UID: "d35a9f7d-8a52-4cc0-a507-a704d6a2a614"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:27:14.840809 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.840784 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-isvc-sklearn-graph-raw-da412-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-da412-kube-rbac-proxy-sar-config") pod "d35a9f7d-8a52-4cc0-a507-a704d6a2a614" (UID: "d35a9f7d-8a52-4cc0-a507-a704d6a2a614"). InnerVolumeSpecName "isvc-sklearn-graph-raw-da412-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:27:14.842519 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.842498 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-kube-api-access-l9vcl" (OuterVolumeSpecName: "kube-api-access-l9vcl") pod "d35a9f7d-8a52-4cc0-a507-a704d6a2a614" (UID: "d35a9f7d-8a52-4cc0-a507-a704d6a2a614"). InnerVolumeSpecName "kube-api-access-l9vcl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:27:14.842609 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.842537 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d35a9f7d-8a52-4cc0-a507-a704d6a2a614" (UID: "d35a9f7d-8a52-4cc0-a507-a704d6a2a614"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:27:14.941631 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.941608 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-proxy-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:27:14.941631 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.941628 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-da412-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-isvc-sklearn-graph-raw-da412-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:27:14.941977 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.941639 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-kserve-provision-location\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:27:14.941977 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:14.941648 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l9vcl\" (UniqueName: \"kubernetes.io/projected/d35a9f7d-8a52-4cc0-a507-a704d6a2a614-kube-api-access-l9vcl\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:27:15.456172 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.456143 2569 generic.go:358] "Generic (PLEG): container finished" podID="679512e3-c45f-4fa4-8854-212d65ef1285" containerID="cfe2aaf45801b4b5ddbed3735f80a5951c7e61ac264b3bea633c0d94680972d2" exitCode=0 Apr 24 21:27:15.456294 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.456213 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" 
event={"ID":"679512e3-c45f-4fa4-8854-212d65ef1285","Type":"ContainerDied","Data":"cfe2aaf45801b4b5ddbed3735f80a5951c7e61ac264b3bea633c0d94680972d2"} Apr 24 21:27:15.458724 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.458695 2569 generic.go:358] "Generic (PLEG): container finished" podID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerID="bcf8bbada951118bd8a84283dec246b22772dd30bcfd32bd4d2572b779ee6d3a" exitCode=0 Apr 24 21:27:15.458842 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.458781 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" event={"ID":"d35a9f7d-8a52-4cc0-a507-a704d6a2a614","Type":"ContainerDied","Data":"bcf8bbada951118bd8a84283dec246b22772dd30bcfd32bd4d2572b779ee6d3a"} Apr 24 21:27:15.458842 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.458797 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" Apr 24 21:27:15.458842 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.458806 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn" event={"ID":"d35a9f7d-8a52-4cc0-a507-a704d6a2a614","Type":"ContainerDied","Data":"4b45ede71b92970f111c876798f93ba51a8580e64e30838bb3c0673969c408e7"} Apr 24 21:27:15.458842 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.458829 2569 scope.go:117] "RemoveContainer" containerID="e10da51d8e9bda31b6949ea5907ca6b10f1add44f2efb9acf8d91c7c5cfc1118" Apr 24 21:27:15.460241 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.460213 2569 generic.go:358] "Generic (PLEG): container finished" podID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerID="c7afbfbefb3b03399418e6f82b1f20315cbc8840afe39f97b3a07a1212c342b2" exitCode=0 Apr 24 21:27:15.460324 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.460266 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" event={"ID":"7f8f1183-7c95-4acf-9967-2c2db7cbe81d","Type":"ContainerDied","Data":"c7afbfbefb3b03399418e6f82b1f20315cbc8840afe39f97b3a07a1212c342b2"} Apr 24 21:27:15.470153 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.470137 2569 scope.go:117] "RemoveContainer" containerID="bcf8bbada951118bd8a84283dec246b22772dd30bcfd32bd4d2572b779ee6d3a" Apr 24 21:27:15.485689 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.485668 2569 scope.go:117] "RemoveContainer" containerID="c5b5bab40dbfbe503019af752df7f9144147a6769e71b6ccbdb8bcab6e3ad439" Apr 24 21:27:15.496911 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.496891 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn"] Apr 24 21:27:15.502035 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.502016 2569 scope.go:117] "RemoveContainer" containerID="e10da51d8e9bda31b6949ea5907ca6b10f1add44f2efb9acf8d91c7c5cfc1118" Apr 24 21:27:15.502330 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.502307 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-da412-predictor-86b6b578d8-7vthn"] Apr 24 21:27:15.502521 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:27:15.502498 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e10da51d8e9bda31b6949ea5907ca6b10f1add44f2efb9acf8d91c7c5cfc1118\": container with ID starting with e10da51d8e9bda31b6949ea5907ca6b10f1add44f2efb9acf8d91c7c5cfc1118 not found: ID does not exist" containerID="e10da51d8e9bda31b6949ea5907ca6b10f1add44f2efb9acf8d91c7c5cfc1118" Apr 24 21:27:15.502609 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.502529 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e10da51d8e9bda31b6949ea5907ca6b10f1add44f2efb9acf8d91c7c5cfc1118"} err="failed to get container status \"e10da51d8e9bda31b6949ea5907ca6b10f1add44f2efb9acf8d91c7c5cfc1118\": rpc error: code = NotFound desc = could not find container \"e10da51d8e9bda31b6949ea5907ca6b10f1add44f2efb9acf8d91c7c5cfc1118\": container with ID starting with e10da51d8e9bda31b6949ea5907ca6b10f1add44f2efb9acf8d91c7c5cfc1118 not found: ID does not exist" Apr 24 21:27:15.502609 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.502554 2569 scope.go:117] "RemoveContainer" containerID="bcf8bbada951118bd8a84283dec246b22772dd30bcfd32bd4d2572b779ee6d3a" Apr 24 21:27:15.502885 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:27:15.502859 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcf8bbada951118bd8a84283dec246b22772dd30bcfd32bd4d2572b779ee6d3a\": container with ID starting with bcf8bbada951118bd8a84283dec246b22772dd30bcfd32bd4d2572b779ee6d3a not found: ID does not exist" containerID="bcf8bbada951118bd8a84283dec246b22772dd30bcfd32bd4d2572b779ee6d3a" Apr 24 21:27:15.502980 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.502892 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcf8bbada951118bd8a84283dec246b22772dd30bcfd32bd4d2572b779ee6d3a"} err="failed to get container status \"bcf8bbada951118bd8a84283dec246b22772dd30bcfd32bd4d2572b779ee6d3a\": rpc error: code = NotFound desc = could not find container \"bcf8bbada951118bd8a84283dec246b22772dd30bcfd32bd4d2572b779ee6d3a\": container with ID starting with bcf8bbada951118bd8a84283dec246b22772dd30bcfd32bd4d2572b779ee6d3a not found: ID does not exist" Apr 24 21:27:15.502980 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.502916 2569 scope.go:117] "RemoveContainer" containerID="c5b5bab40dbfbe503019af752df7f9144147a6769e71b6ccbdb8bcab6e3ad439" Apr 24 21:27:15.503215 ip-10-0-132-118 
kubenswrapper[2569]: E0424 21:27:15.503195 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5b5bab40dbfbe503019af752df7f9144147a6769e71b6ccbdb8bcab6e3ad439\": container with ID starting with c5b5bab40dbfbe503019af752df7f9144147a6769e71b6ccbdb8bcab6e3ad439 not found: ID does not exist" containerID="c5b5bab40dbfbe503019af752df7f9144147a6769e71b6ccbdb8bcab6e3ad439" Apr 24 21:27:15.503296 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:15.503221 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5b5bab40dbfbe503019af752df7f9144147a6769e71b6ccbdb8bcab6e3ad439"} err="failed to get container status \"c5b5bab40dbfbe503019af752df7f9144147a6769e71b6ccbdb8bcab6e3ad439\": rpc error: code = NotFound desc = could not find container \"c5b5bab40dbfbe503019af752df7f9144147a6769e71b6ccbdb8bcab6e3ad439\": container with ID starting with c5b5bab40dbfbe503019af752df7f9144147a6769e71b6ccbdb8bcab6e3ad439 not found: ID does not exist" Apr 24 21:27:16.437193 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:16.437158 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" path="/var/lib/kubelet/pods/a347c45e-1f6f-4252-a817-9390a6dd3ec5/volumes" Apr 24 21:27:16.437918 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:16.437895 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" path="/var/lib/kubelet/pods/d35a9f7d-8a52-4cc0-a507-a704d6a2a614/volumes" Apr 24 21:27:16.465200 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:16.465175 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" event={"ID":"679512e3-c45f-4fa4-8854-212d65ef1285","Type":"ContainerStarted","Data":"929ab33c7eb3c58801339ba7788a5f7eeb1feeb84c0bd631cdc075b06c459261"} Apr 24 21:27:16.465332 
ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:16.465212 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" event={"ID":"679512e3-c45f-4fa4-8854-212d65ef1285","Type":"ContainerStarted","Data":"134a96822dab28fadd0af904a942df79829cf0de4ff9ea6000c3913a5cbf0bb2"} Apr 24 21:27:16.465417 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:16.465401 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" Apr 24 21:27:16.467534 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:16.467516 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" event={"ID":"7f8f1183-7c95-4acf-9967-2c2db7cbe81d","Type":"ContainerStarted","Data":"82975e02ebe86d4d5f0145eaabb448d3cdd4b0f0ce67e603e73c5fc5a83d0f60"} Apr 24 21:27:16.467628 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:16.467538 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" event={"ID":"7f8f1183-7c95-4acf-9967-2c2db7cbe81d","Type":"ContainerStarted","Data":"9fee4fe870a124386aeff04ba12490805bbbaab280ce7eb53e927a6f44f272d0"} Apr 24 21:27:16.467737 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:16.467723 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" Apr 24 21:27:16.467796 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:16.467749 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" Apr 24 21:27:16.469170 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:16.469147 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:27:16.486855 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:16.486815 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" podStartSLOduration=6.486803055 podStartE2EDuration="6.486803055s" podCreationTimestamp="2026-04-24 21:27:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:16.485095009 +0000 UTC m=+624.619493651" watchObservedRunningTime="2026-04-24 21:27:16.486803055 +0000 UTC m=+624.621201677" Apr 24 21:27:16.505502 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:16.505468 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" podStartSLOduration=6.505454129 podStartE2EDuration="6.505454129s" podCreationTimestamp="2026-04-24 21:27:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:16.504690421 +0000 UTC m=+624.639089045" watchObservedRunningTime="2026-04-24 21:27:16.505454129 +0000 UTC m=+624.639852760" Apr 24 21:27:17.471104 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:17.471051 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:27:17.471515 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:17.471197 2569 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" Apr 24 21:27:17.472237 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:17.472214 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 21:27:18.473322 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:18.473280 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 21:27:22.480615 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:22.480588 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" Apr 24 21:27:22.481195 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:22.481164 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:27:23.478152 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:23.478121 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" Apr 24 21:27:23.478703 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:23.478674 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 21:27:32.481306 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:32.481263 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:27:33.479254 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:33.479221 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 21:27:42.481524 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:42.481486 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:27:43.479079 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:43.479044 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 21:27:52.481578 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:52.481544 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:27:53.478779 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:27:53.478726 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 21:28:02.481270 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:02.481229 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:28:03.479427 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:03.479391 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 21:28:12.481940 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:12.481910 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" Apr 24 21:28:13.478926 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:13.478887 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.25:8080: connect: connection refused" Apr 24 21:28:23.479923 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:23.479894 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" Apr 24 21:28:50.740591 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.740515 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"] Apr 24 21:28:50.741108 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.740884 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kserve-container" containerID="cri-o://134a96822dab28fadd0af904a942df79829cf0de4ff9ea6000c3913a5cbf0bb2" gracePeriod=30 Apr 24 21:28:50.741108 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.740914 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kube-rbac-proxy" containerID="cri-o://929ab33c7eb3c58801339ba7788a5f7eeb1feeb84c0bd631cdc075b06c459261" gracePeriod=30 Apr 24 21:28:50.779936 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.779910 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs"] Apr 24 21:28:50.780335 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.780292 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kserve-container" Apr 24 21:28:50.780335 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.780308 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" 
containerName="kserve-container" Apr 24 21:28:50.780335 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.780323 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="storage-initializer" Apr 24 21:28:50.780335 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.780329 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="storage-initializer" Apr 24 21:28:50.780482 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.780352 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kube-rbac-proxy" Apr 24 21:28:50.780482 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.780358 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kube-rbac-proxy" Apr 24 21:28:50.780482 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.780366 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="storage-initializer" Apr 24 21:28:50.780482 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.780372 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="storage-initializer" Apr 24 21:28:50.780482 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.780380 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kserve-container" Apr 24 21:28:50.780482 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.780388 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kserve-container" Apr 24 21:28:50.780482 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.780409 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kube-rbac-proxy" Apr 24 21:28:50.780482 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.780418 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kube-rbac-proxy" Apr 24 21:28:50.780736 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.780502 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kube-rbac-proxy" Apr 24 21:28:50.780736 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.780510 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kube-rbac-proxy" Apr 24 21:28:50.780736 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.780517 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d35a9f7d-8a52-4cc0-a507-a704d6a2a614" containerName="kserve-container" Apr 24 21:28:50.780736 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.780526 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a347c45e-1f6f-4252-a817-9390a6dd3ec5" containerName="kserve-container" Apr 24 21:28:50.783822 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.783807 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:28:50.786335 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.786317 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-66937-predictor-serving-cert\"" Apr 24 21:28:50.786433 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.786407 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-66937-kube-rbac-proxy-sar-config\"" Apr 24 21:28:50.794082 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.794024 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs"] Apr 24 21:28:50.820019 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.819993 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"] Apr 24 21:28:50.820323 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.820298 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kserve-container" containerID="cri-o://9fee4fe870a124386aeff04ba12490805bbbaab280ce7eb53e927a6f44f272d0" gracePeriod=30 Apr 24 21:28:50.820425 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.820333 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kube-rbac-proxy" containerID="cri-o://82975e02ebe86d4d5f0145eaabb448d3cdd4b0f0ce67e603e73c5fc5a83d0f60" gracePeriod=30 Apr 24 21:28:50.923337 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.923310 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-raw-66937-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cf53e686-46b5-470b-90cf-27b7cba22991-message-dumper-raw-66937-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-66937-predictor-86db465969-zxjgs\" (UID: \"cf53e686-46b5-470b-90cf-27b7cba22991\") " pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:28:50.923456 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.923362 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf53e686-46b5-470b-90cf-27b7cba22991-proxy-tls\") pod \"message-dumper-raw-66937-predictor-86db465969-zxjgs\" (UID: \"cf53e686-46b5-470b-90cf-27b7cba22991\") " pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:28:50.923456 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:50.923385 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjhgc\" (UniqueName: \"kubernetes.io/projected/cf53e686-46b5-470b-90cf-27b7cba22991-kube-api-access-kjhgc\") pod \"message-dumper-raw-66937-predictor-86db465969-zxjgs\" (UID: \"cf53e686-46b5-470b-90cf-27b7cba22991\") " pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:28:51.024131 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:51.024075 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-raw-66937-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cf53e686-46b5-470b-90cf-27b7cba22991-message-dumper-raw-66937-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-66937-predictor-86db465969-zxjgs\" (UID: \"cf53e686-46b5-470b-90cf-27b7cba22991\") " pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:28:51.024236 ip-10-0-132-118 
kubenswrapper[2569]: I0424 21:28:51.024128 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf53e686-46b5-470b-90cf-27b7cba22991-proxy-tls\") pod \"message-dumper-raw-66937-predictor-86db465969-zxjgs\" (UID: \"cf53e686-46b5-470b-90cf-27b7cba22991\") " pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:28:51.024236 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:51.024157 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjhgc\" (UniqueName: \"kubernetes.io/projected/cf53e686-46b5-470b-90cf-27b7cba22991-kube-api-access-kjhgc\") pod \"message-dumper-raw-66937-predictor-86db465969-zxjgs\" (UID: \"cf53e686-46b5-470b-90cf-27b7cba22991\") " pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:28:51.024322 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:28:51.024285 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-raw-66937-predictor-serving-cert: secret "message-dumper-raw-66937-predictor-serving-cert" not found Apr 24 21:28:51.024386 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:28:51.024376 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf53e686-46b5-470b-90cf-27b7cba22991-proxy-tls podName:cf53e686-46b5-470b-90cf-27b7cba22991 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:51.524356392 +0000 UTC m=+719.658755006 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cf53e686-46b5-470b-90cf-27b7cba22991-proxy-tls") pod "message-dumper-raw-66937-predictor-86db465969-zxjgs" (UID: "cf53e686-46b5-470b-90cf-27b7cba22991") : secret "message-dumper-raw-66937-predictor-serving-cert" not found Apr 24 21:28:51.024674 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:51.024653 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-raw-66937-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cf53e686-46b5-470b-90cf-27b7cba22991-message-dumper-raw-66937-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-66937-predictor-86db465969-zxjgs\" (UID: \"cf53e686-46b5-470b-90cf-27b7cba22991\") " pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:28:51.032684 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:51.032660 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjhgc\" (UniqueName: \"kubernetes.io/projected/cf53e686-46b5-470b-90cf-27b7cba22991-kube-api-access-kjhgc\") pod \"message-dumper-raw-66937-predictor-86db465969-zxjgs\" (UID: \"cf53e686-46b5-470b-90cf-27b7cba22991\") " pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:28:51.527730 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:51.527694 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf53e686-46b5-470b-90cf-27b7cba22991-proxy-tls\") pod \"message-dumper-raw-66937-predictor-86db465969-zxjgs\" (UID: \"cf53e686-46b5-470b-90cf-27b7cba22991\") " pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:28:51.530050 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:51.530025 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/cf53e686-46b5-470b-90cf-27b7cba22991-proxy-tls\") pod \"message-dumper-raw-66937-predictor-86db465969-zxjgs\" (UID: \"cf53e686-46b5-470b-90cf-27b7cba22991\") " pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:28:51.695177 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:51.695147 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:28:51.751922 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:51.751887 2569 generic.go:358] "Generic (PLEG): container finished" podID="679512e3-c45f-4fa4-8854-212d65ef1285" containerID="929ab33c7eb3c58801339ba7788a5f7eeb1feeb84c0bd631cdc075b06c459261" exitCode=2 Apr 24 21:28:51.752308 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:51.752004 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" event={"ID":"679512e3-c45f-4fa4-8854-212d65ef1285","Type":"ContainerDied","Data":"929ab33c7eb3c58801339ba7788a5f7eeb1feeb84c0bd631cdc075b06c459261"} Apr 24 21:28:51.754691 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:51.754660 2569 generic.go:358] "Generic (PLEG): container finished" podID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerID="82975e02ebe86d4d5f0145eaabb448d3cdd4b0f0ce67e603e73c5fc5a83d0f60" exitCode=2 Apr 24 21:28:51.754843 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:51.754735 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" event={"ID":"7f8f1183-7c95-4acf-9967-2c2db7cbe81d","Type":"ContainerDied","Data":"82975e02ebe86d4d5f0145eaabb448d3cdd4b0f0ce67e603e73c5fc5a83d0f60"} Apr 24 21:28:51.813663 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:51.813601 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs"] Apr 24 21:28:51.816505 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:28:51.816480 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf53e686_46b5_470b_90cf_27b7cba22991.slice/crio-5ca68c2784417db9d6305992365d2355315702128cf84cfdb47870c2c796d2f9 WatchSource:0}: Error finding container 5ca68c2784417db9d6305992365d2355315702128cf84cfdb47870c2c796d2f9: Status 404 returned error can't find the container with id 5ca68c2784417db9d6305992365d2355315702128cf84cfdb47870c2c796d2f9 Apr 24 21:28:51.818202 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:51.818187 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:28:52.471605 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:52.471574 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused" Apr 24 21:28:52.481477 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:52.481449 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:28:52.758348 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:52.758274 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" 
event={"ID":"cf53e686-46b5-470b-90cf-27b7cba22991","Type":"ContainerStarted","Data":"5ca68c2784417db9d6305992365d2355315702128cf84cfdb47870c2c796d2f9"} Apr 24 21:28:53.474233 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:53.474152 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection refused" Apr 24 21:28:53.478604 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:53.478563 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 21:28:53.762776 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:53.762658 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" event={"ID":"cf53e686-46b5-470b-90cf-27b7cba22991","Type":"ContainerStarted","Data":"88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075"} Apr 24 21:28:53.762776 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:53.762696 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" event={"ID":"cf53e686-46b5-470b-90cf-27b7cba22991","Type":"ContainerStarted","Data":"9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3"} Apr 24 21:28:53.763271 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:53.762806 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:28:53.781547 ip-10-0-132-118 kubenswrapper[2569]: 
I0424 21:28:53.781498 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" podStartSLOduration=2.463618078 podStartE2EDuration="3.781484181s" podCreationTimestamp="2026-04-24 21:28:50 +0000 UTC" firstStartedPulling="2026-04-24 21:28:51.818307397 +0000 UTC m=+719.952705997" lastFinishedPulling="2026-04-24 21:28:53.136173498 +0000 UTC m=+721.270572100" observedRunningTime="2026-04-24 21:28:53.78069277 +0000 UTC m=+721.915091392" watchObservedRunningTime="2026-04-24 21:28:53.781484181 +0000 UTC m=+721.915882802" Apr 24 21:28:54.460168 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.460148 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" Apr 24 21:28:54.554064 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.554035 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-kserve-provision-location\") pod \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " Apr 24 21:28:54.554199 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.554077 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mc4p\" (UniqueName: \"kubernetes.io/projected/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-kube-api-access-5mc4p\") pod \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " Apr 24 21:28:54.554199 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.554131 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-isvc-xgboost-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\") pod \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " Apr 24 21:28:54.554199 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.554179 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-proxy-tls\") pod \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\" (UID: \"7f8f1183-7c95-4acf-9967-2c2db7cbe81d\") " Apr 24 21:28:54.554397 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.554369 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7f8f1183-7c95-4acf-9967-2c2db7cbe81d" (UID: "7f8f1183-7c95-4acf-9967-2c2db7cbe81d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:28:54.554515 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.554489 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-isvc-xgboost-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config") pod "7f8f1183-7c95-4acf-9967-2c2db7cbe81d" (UID: "7f8f1183-7c95-4acf-9967-2c2db7cbe81d"). InnerVolumeSpecName "isvc-xgboost-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:28:54.556218 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.556193 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-kube-api-access-5mc4p" (OuterVolumeSpecName: "kube-api-access-5mc4p") pod "7f8f1183-7c95-4acf-9967-2c2db7cbe81d" (UID: "7f8f1183-7c95-4acf-9967-2c2db7cbe81d"). InnerVolumeSpecName "kube-api-access-5mc4p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:28:54.556303 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.556256 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7f8f1183-7c95-4acf-9967-2c2db7cbe81d" (UID: "7f8f1183-7c95-4acf-9967-2c2db7cbe81d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:28:54.655164 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.655138 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-isvc-xgboost-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:28:54.655164 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.655164 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-proxy-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:28:54.655327 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.655179 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-kserve-provision-location\") on node \"ip-10-0-132-118.ec2.internal\" 
DevicePath \"\"" Apr 24 21:28:54.655327 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.655193 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5mc4p\" (UniqueName: \"kubernetes.io/projected/7f8f1183-7c95-4acf-9967-2c2db7cbe81d-kube-api-access-5mc4p\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:28:54.767133 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.767106 2569 generic.go:358] "Generic (PLEG): container finished" podID="679512e3-c45f-4fa4-8854-212d65ef1285" containerID="134a96822dab28fadd0af904a942df79829cf0de4ff9ea6000c3913a5cbf0bb2" exitCode=0 Apr 24 21:28:54.767419 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.767177 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" event={"ID":"679512e3-c45f-4fa4-8854-212d65ef1285","Type":"ContainerDied","Data":"134a96822dab28fadd0af904a942df79829cf0de4ff9ea6000c3913a5cbf0bb2"} Apr 24 21:28:54.768905 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.768883 2569 generic.go:358] "Generic (PLEG): container finished" podID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerID="9fee4fe870a124386aeff04ba12490805bbbaab280ce7eb53e927a6f44f272d0" exitCode=0 Apr 24 21:28:54.769001 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.768911 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" event={"ID":"7f8f1183-7c95-4acf-9967-2c2db7cbe81d","Type":"ContainerDied","Data":"9fee4fe870a124386aeff04ba12490805bbbaab280ce7eb53e927a6f44f272d0"} Apr 24 21:28:54.769001 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.768948 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" 
event={"ID":"7f8f1183-7c95-4acf-9967-2c2db7cbe81d","Type":"ContainerDied","Data":"69bf0d2c628dcc17177252f0444d1f6bc46dab1192f0c60a1b2b77d5a6c89e9e"} Apr 24 21:28:54.769001 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.768972 2569 scope.go:117] "RemoveContainer" containerID="82975e02ebe86d4d5f0145eaabb448d3cdd4b0f0ce67e603e73c5fc5a83d0f60" Apr 24 21:28:54.769001 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.768995 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf" Apr 24 21:28:54.769520 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.769502 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:28:54.771796 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.771775 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:28:54.790346 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.790328 2569 scope.go:117] "RemoveContainer" containerID="9fee4fe870a124386aeff04ba12490805bbbaab280ce7eb53e927a6f44f272d0" Apr 24 21:28:54.797901 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.797871 2569 scope.go:117] "RemoveContainer" containerID="c7afbfbefb3b03399418e6f82b1f20315cbc8840afe39f97b3a07a1212c342b2" Apr 24 21:28:54.801203 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.801173 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"] Apr 24 21:28:54.806484 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.806466 2569 scope.go:117] "RemoveContainer" containerID="82975e02ebe86d4d5f0145eaabb448d3cdd4b0f0ce67e603e73c5fc5a83d0f60" Apr 24 21:28:54.806796 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:28:54.806774 2569 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82975e02ebe86d4d5f0145eaabb448d3cdd4b0f0ce67e603e73c5fc5a83d0f60\": container with ID starting with 82975e02ebe86d4d5f0145eaabb448d3cdd4b0f0ce67e603e73c5fc5a83d0f60 not found: ID does not exist" containerID="82975e02ebe86d4d5f0145eaabb448d3cdd4b0f0ce67e603e73c5fc5a83d0f60" Apr 24 21:28:54.806891 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.806803 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82975e02ebe86d4d5f0145eaabb448d3cdd4b0f0ce67e603e73c5fc5a83d0f60"} err="failed to get container status \"82975e02ebe86d4d5f0145eaabb448d3cdd4b0f0ce67e603e73c5fc5a83d0f60\": rpc error: code = NotFound desc = could not find container \"82975e02ebe86d4d5f0145eaabb448d3cdd4b0f0ce67e603e73c5fc5a83d0f60\": container with ID starting with 82975e02ebe86d4d5f0145eaabb448d3cdd4b0f0ce67e603e73c5fc5a83d0f60 not found: ID does not exist" Apr 24 21:28:54.806891 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.806822 2569 scope.go:117] "RemoveContainer" containerID="9fee4fe870a124386aeff04ba12490805bbbaab280ce7eb53e927a6f44f272d0" Apr 24 21:28:54.807090 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:28:54.807069 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fee4fe870a124386aeff04ba12490805bbbaab280ce7eb53e927a6f44f272d0\": container with ID starting with 9fee4fe870a124386aeff04ba12490805bbbaab280ce7eb53e927a6f44f272d0 not found: ID does not exist" containerID="9fee4fe870a124386aeff04ba12490805bbbaab280ce7eb53e927a6f44f272d0" Apr 24 21:28:54.807196 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.807090 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fee4fe870a124386aeff04ba12490805bbbaab280ce7eb53e927a6f44f272d0"} err="failed to get container status 
\"9fee4fe870a124386aeff04ba12490805bbbaab280ce7eb53e927a6f44f272d0\": rpc error: code = NotFound desc = could not find container \"9fee4fe870a124386aeff04ba12490805bbbaab280ce7eb53e927a6f44f272d0\": container with ID starting with 9fee4fe870a124386aeff04ba12490805bbbaab280ce7eb53e927a6f44f272d0 not found: ID does not exist" Apr 24 21:28:54.807196 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.807105 2569 scope.go:117] "RemoveContainer" containerID="c7afbfbefb3b03399418e6f82b1f20315cbc8840afe39f97b3a07a1212c342b2" Apr 24 21:28:54.807196 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.807130 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-07ea4-predictor-677dc57f77-2nfpf"] Apr 24 21:28:54.807360 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:28:54.807325 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7afbfbefb3b03399418e6f82b1f20315cbc8840afe39f97b3a07a1212c342b2\": container with ID starting with c7afbfbefb3b03399418e6f82b1f20315cbc8840afe39f97b3a07a1212c342b2 not found: ID does not exist" containerID="c7afbfbefb3b03399418e6f82b1f20315cbc8840afe39f97b3a07a1212c342b2" Apr 24 21:28:54.807360 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.807349 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7afbfbefb3b03399418e6f82b1f20315cbc8840afe39f97b3a07a1212c342b2"} err="failed to get container status \"c7afbfbefb3b03399418e6f82b1f20315cbc8840afe39f97b3a07a1212c342b2\": rpc error: code = NotFound desc = could not find container \"c7afbfbefb3b03399418e6f82b1f20315cbc8840afe39f97b3a07a1212c342b2\": container with ID starting with c7afbfbefb3b03399418e6f82b1f20315cbc8840afe39f97b3a07a1212c342b2 not found: ID does not exist" Apr 24 21:28:54.874253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:54.874230 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" Apr 24 21:28:55.058631 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.058544 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/679512e3-c45f-4fa4-8854-212d65ef1285-isvc-sklearn-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\") pod \"679512e3-c45f-4fa4-8854-212d65ef1285\" (UID: \"679512e3-c45f-4fa4-8854-212d65ef1285\") " Apr 24 21:28:55.058631 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.058584 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/679512e3-c45f-4fa4-8854-212d65ef1285-proxy-tls\") pod \"679512e3-c45f-4fa4-8854-212d65ef1285\" (UID: \"679512e3-c45f-4fa4-8854-212d65ef1285\") " Apr 24 21:28:55.058885 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.058648 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzl2t\" (UniqueName: \"kubernetes.io/projected/679512e3-c45f-4fa4-8854-212d65ef1285-kube-api-access-kzl2t\") pod \"679512e3-c45f-4fa4-8854-212d65ef1285\" (UID: \"679512e3-c45f-4fa4-8854-212d65ef1285\") " Apr 24 21:28:55.058885 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.058698 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/679512e3-c45f-4fa4-8854-212d65ef1285-kserve-provision-location\") pod \"679512e3-c45f-4fa4-8854-212d65ef1285\" (UID: \"679512e3-c45f-4fa4-8854-212d65ef1285\") " Apr 24 21:28:55.058997 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.058905 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/679512e3-c45f-4fa4-8854-212d65ef1285-isvc-sklearn-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config" 
(OuterVolumeSpecName: "isvc-sklearn-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config") pod "679512e3-c45f-4fa4-8854-212d65ef1285" (UID: "679512e3-c45f-4fa4-8854-212d65ef1285"). InnerVolumeSpecName "isvc-sklearn-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:28:55.059062 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.059042 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/679512e3-c45f-4fa4-8854-212d65ef1285-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "679512e3-c45f-4fa4-8854-212d65ef1285" (UID: "679512e3-c45f-4fa4-8854-212d65ef1285"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:28:55.060691 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.060662 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/679512e3-c45f-4fa4-8854-212d65ef1285-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "679512e3-c45f-4fa4-8854-212d65ef1285" (UID: "679512e3-c45f-4fa4-8854-212d65ef1285"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:28:55.060827 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.060714 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/679512e3-c45f-4fa4-8854-212d65ef1285-kube-api-access-kzl2t" (OuterVolumeSpecName: "kube-api-access-kzl2t") pod "679512e3-c45f-4fa4-8854-212d65ef1285" (UID: "679512e3-c45f-4fa4-8854-212d65ef1285"). InnerVolumeSpecName "kube-api-access-kzl2t". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:28:55.159194 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.159168 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kzl2t\" (UniqueName: \"kubernetes.io/projected/679512e3-c45f-4fa4-8854-212d65ef1285-kube-api-access-kzl2t\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:28:55.159194 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.159190 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/679512e3-c45f-4fa4-8854-212d65ef1285-kserve-provision-location\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:28:55.159337 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.159201 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/679512e3-c45f-4fa4-8854-212d65ef1285-isvc-sklearn-graph-raw-hpa-07ea4-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:28:55.159337 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.159212 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/679512e3-c45f-4fa4-8854-212d65ef1285-proxy-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:28:55.774437 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.774403 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" event={"ID":"679512e3-c45f-4fa4-8854-212d65ef1285","Type":"ContainerDied","Data":"42a4b1be7f26ca87df64009bbe1d9023dfa9d0b9116473425e2ded56c8a94773"} Apr 24 21:28:55.774437 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.774429 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb" Apr 24 21:28:55.774929 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.774457 2569 scope.go:117] "RemoveContainer" containerID="929ab33c7eb3c58801339ba7788a5f7eeb1feeb84c0bd631cdc075b06c459261" Apr 24 21:28:55.783100 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.783059 2569 scope.go:117] "RemoveContainer" containerID="134a96822dab28fadd0af904a942df79829cf0de4ff9ea6000c3913a5cbf0bb2" Apr 24 21:28:55.789869 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.789853 2569 scope.go:117] "RemoveContainer" containerID="cfe2aaf45801b4b5ddbed3735f80a5951c7e61ac264b3bea633c0d94680972d2" Apr 24 21:28:55.796827 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.796809 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"] Apr 24 21:28:55.803153 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:55.803134 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-07ea4-predictor-6c64cf7456-dphqb"] Apr 24 21:28:56.436483 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:56.436451 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" path="/var/lib/kubelet/pods/679512e3-c45f-4fa4-8854-212d65ef1285/volumes" Apr 24 21:28:56.436942 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:28:56.436928 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" path="/var/lib/kubelet/pods/7f8f1183-7c95-4acf-9967-2c2db7cbe81d/volumes" Apr 24 21:29:01.786677 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:01.786647 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:29:10.828327 ip-10-0-132-118 kubenswrapper[2569]: I0424 
21:29:10.828295 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz"] Apr 24 21:29:10.828669 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.828624 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kube-rbac-proxy" Apr 24 21:29:10.828669 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.828635 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kube-rbac-proxy" Apr 24 21:29:10.828669 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.828643 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kube-rbac-proxy" Apr 24 21:29:10.828669 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.828649 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kube-rbac-proxy" Apr 24 21:29:10.828669 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.828656 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kserve-container" Apr 24 21:29:10.828669 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.828662 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kserve-container" Apr 24 21:29:10.828669 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.828671 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="storage-initializer" Apr 24 21:29:10.828971 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.828676 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="storage-initializer" Apr 24 21:29:10.828971 ip-10-0-132-118 
kubenswrapper[2569]: I0424 21:29:10.828683 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kserve-container" Apr 24 21:29:10.828971 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.828689 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kserve-container" Apr 24 21:29:10.828971 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.828697 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="storage-initializer" Apr 24 21:29:10.828971 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.828702 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="storage-initializer" Apr 24 21:29:10.828971 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.828768 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kube-rbac-proxy" Apr 24 21:29:10.828971 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.828779 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kube-rbac-proxy" Apr 24 21:29:10.828971 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.828787 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="679512e3-c45f-4fa4-8854-212d65ef1285" containerName="kserve-container" Apr 24 21:29:10.828971 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.828794 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f8f1183-7c95-4acf-9967-2c2db7cbe81d" containerName="kserve-container" Apr 24 21:29:10.833669 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.833647 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:10.836189 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.836162 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-66937-kube-rbac-proxy-sar-config\"" Apr 24 21:29:10.836294 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.836262 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-66937-predictor-serving-cert\"" Apr 24 21:29:10.841658 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.841635 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz"] Apr 24 21:29:10.874961 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.874930 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-raw-66937-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/34f17ebf-3db3-42bf-af14-e93e366ee8be-isvc-logger-raw-66937-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz\" (UID: \"34f17ebf-3db3-42bf-af14-e93e366ee8be\") " pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:10.875058 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.875024 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34f17ebf-3db3-42bf-af14-e93e366ee8be-kserve-provision-location\") pod \"isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz\" (UID: \"34f17ebf-3db3-42bf-af14-e93e366ee8be\") " pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:10.875058 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.875047 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-zd8dz\" (UniqueName: \"kubernetes.io/projected/34f17ebf-3db3-42bf-af14-e93e366ee8be-kube-api-access-zd8dz\") pod \"isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz\" (UID: \"34f17ebf-3db3-42bf-af14-e93e366ee8be\") " pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:10.875135 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.875091 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34f17ebf-3db3-42bf-af14-e93e366ee8be-proxy-tls\") pod \"isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz\" (UID: \"34f17ebf-3db3-42bf-af14-e93e366ee8be\") " pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:10.976300 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.976272 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34f17ebf-3db3-42bf-af14-e93e366ee8be-kserve-provision-location\") pod \"isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz\" (UID: \"34f17ebf-3db3-42bf-af14-e93e366ee8be\") " pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:10.976401 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.976308 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zd8dz\" (UniqueName: \"kubernetes.io/projected/34f17ebf-3db3-42bf-af14-e93e366ee8be-kube-api-access-zd8dz\") pod \"isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz\" (UID: \"34f17ebf-3db3-42bf-af14-e93e366ee8be\") " pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:10.976401 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.976338 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34f17ebf-3db3-42bf-af14-e93e366ee8be-proxy-tls\") pod 
\"isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz\" (UID: \"34f17ebf-3db3-42bf-af14-e93e366ee8be\") " pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:10.976401 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.976380 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-raw-66937-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/34f17ebf-3db3-42bf-af14-e93e366ee8be-isvc-logger-raw-66937-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz\" (UID: \"34f17ebf-3db3-42bf-af14-e93e366ee8be\") " pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:10.976794 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.976750 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34f17ebf-3db3-42bf-af14-e93e366ee8be-kserve-provision-location\") pod \"isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz\" (UID: \"34f17ebf-3db3-42bf-af14-e93e366ee8be\") " pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:10.977032 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.977015 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-raw-66937-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/34f17ebf-3db3-42bf-af14-e93e366ee8be-isvc-logger-raw-66937-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz\" (UID: \"34f17ebf-3db3-42bf-af14-e93e366ee8be\") " pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:10.978672 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.978652 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34f17ebf-3db3-42bf-af14-e93e366ee8be-proxy-tls\") pod 
\"isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz\" (UID: \"34f17ebf-3db3-42bf-af14-e93e366ee8be\") " pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:10.985312 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:10.985291 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd8dz\" (UniqueName: \"kubernetes.io/projected/34f17ebf-3db3-42bf-af14-e93e366ee8be-kube-api-access-zd8dz\") pod \"isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz\" (UID: \"34f17ebf-3db3-42bf-af14-e93e366ee8be\") " pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:11.145045 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:11.144989 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:11.265003 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:11.264977 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz"] Apr 24 21:29:11.266877 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:29:11.266850 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34f17ebf_3db3_42bf_af14_e93e366ee8be.slice/crio-cef0016369dcf8efaa542fff24613b3d7828da6995392feccfe40172680aaca3 WatchSource:0}: Error finding container cef0016369dcf8efaa542fff24613b3d7828da6995392feccfe40172680aaca3: Status 404 returned error can't find the container with id cef0016369dcf8efaa542fff24613b3d7828da6995392feccfe40172680aaca3 Apr 24 21:29:11.827067 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:11.827029 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" 
event={"ID":"34f17ebf-3db3-42bf-af14-e93e366ee8be","Type":"ContainerStarted","Data":"047c70486bdd870decdabf7b489c54b0f1b847aea3017f71aaf48fe616b87222"} Apr 24 21:29:11.827067 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:11.827074 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" event={"ID":"34f17ebf-3db3-42bf-af14-e93e366ee8be","Type":"ContainerStarted","Data":"cef0016369dcf8efaa542fff24613b3d7828da6995392feccfe40172680aaca3"} Apr 24 21:29:15.841199 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:15.841160 2569 generic.go:358] "Generic (PLEG): container finished" podID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerID="047c70486bdd870decdabf7b489c54b0f1b847aea3017f71aaf48fe616b87222" exitCode=0 Apr 24 21:29:15.841555 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:15.841230 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" event={"ID":"34f17ebf-3db3-42bf-af14-e93e366ee8be","Type":"ContainerDied","Data":"047c70486bdd870decdabf7b489c54b0f1b847aea3017f71aaf48fe616b87222"} Apr 24 21:29:16.846804 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:16.846750 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" event={"ID":"34f17ebf-3db3-42bf-af14-e93e366ee8be","Type":"ContainerStarted","Data":"8202add44422ab8d72b2697f88f5df5df4e335406aafaba4e060b8e69934286f"} Apr 24 21:29:16.847246 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:16.846812 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" event={"ID":"34f17ebf-3db3-42bf-af14-e93e366ee8be","Type":"ContainerStarted","Data":"a72ad9f2596b6a0f6b6b70d260652e6da5bc7ccbaa9d61423fdab36fdf450dc0"} Apr 24 21:29:16.847246 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:16.846827 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" event={"ID":"34f17ebf-3db3-42bf-af14-e93e366ee8be","Type":"ContainerStarted","Data":"5b3badfd0a1b1e0c19f7d947f9ab71f7ce845fdfec3bdb03ae562e3178f9400d"} Apr 24 21:29:16.847246 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:16.847204 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:16.847412 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:16.847313 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:16.848316 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:16.848293 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 21:29:16.867264 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:16.867222 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podStartSLOduration=6.8672109500000005 podStartE2EDuration="6.86721095s" podCreationTimestamp="2026-04-24 21:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:16.865633979 +0000 UTC m=+745.000032599" watchObservedRunningTime="2026-04-24 21:29:16.86721095 +0000 UTC m=+745.001609572" Apr 24 21:29:17.850385 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:17.850348 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:17.850790 ip-10-0-132-118 
kubenswrapper[2569]: I0424 21:29:17.850490 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 21:29:17.851349 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:17.851328 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:18.853834 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:18.853796 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 21:29:18.854277 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:18.854251 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:23.858168 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:23.858143 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:29:23.863782 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:23.858730 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: 
connect: connection refused" Apr 24 21:29:23.863782 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:23.859177 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:33.858886 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:33.858843 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 21:29:33.859418 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:33.859394 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:43.859189 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:43.859144 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 21:29:43.859637 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:43.859559 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:53.859403 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:53.859352 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 21:29:53.859887 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:29:53.859852 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:03.859395 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:03.859353 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 21:30:03.859944 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:03.859872 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:13.859182 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:13.859143 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 21:30:13.859740 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:13.859715 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="agent" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 24 21:30:23.858963 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:23.858930 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:30:23.859391 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:23.859182 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:30:35.857138 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:35.857103 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-66937-predictor-86db465969-zxjgs_cf53e686-46b5-470b-90cf-27b7cba22991/kserve-container/0.log" Apr 24 21:30:36.025096 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.025066 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz"] Apr 24 21:30:36.025462 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.025406 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kserve-container" containerID="cri-o://5b3badfd0a1b1e0c19f7d947f9ab71f7ce845fdfec3bdb03ae562e3178f9400d" gracePeriod=30 Apr 24 21:30:36.025606 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.025438 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="agent" containerID="cri-o://8202add44422ab8d72b2697f88f5df5df4e335406aafaba4e060b8e69934286f" gracePeriod=30 Apr 24 21:30:36.025606 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.025451 2569 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kube-rbac-proxy" containerID="cri-o://a72ad9f2596b6a0f6b6b70d260652e6da5bc7ccbaa9d61423fdab36fdf450dc0" gracePeriod=30 Apr 24 21:30:36.083734 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.083697 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg"] Apr 24 21:30:36.087501 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.087477 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:36.090041 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.090023 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-63746-predictor-serving-cert\"" Apr 24 21:30:36.090041 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.090030 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-63746-kube-rbac-proxy-sar-config\"" Apr 24 21:30:36.096462 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.096444 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg"] Apr 24 21:30:36.127414 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.127339 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs"] Apr 24 21:30:36.127666 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.127641 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" podUID="cf53e686-46b5-470b-90cf-27b7cba22991" containerName="kserve-container" 
containerID="cri-o://9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3" gracePeriod=30 Apr 24 21:30:36.127790 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.127702 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" podUID="cf53e686-46b5-470b-90cf-27b7cba22991" containerName="kube-rbac-proxy" containerID="cri-o://88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075" gracePeriod=30 Apr 24 21:30:36.194572 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.194536 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5jlc\" (UniqueName: \"kubernetes.io/projected/6112d18d-543e-4963-a027-c640879c9c06-kube-api-access-g5jlc\") pod \"isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:36.194710 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.194577 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-scale-raw-63746-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6112d18d-543e-4963-a027-c640879c9c06-isvc-sklearn-scale-raw-63746-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:36.194710 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.194673 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6112d18d-543e-4963-a027-c640879c9c06-proxy-tls\") pod \"isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:36.194710 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.194701 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6112d18d-543e-4963-a027-c640879c9c06-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:36.295173 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.295143 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6112d18d-543e-4963-a027-c640879c9c06-proxy-tls\") pod \"isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:36.295298 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.295192 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6112d18d-543e-4963-a027-c640879c9c06-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:36.295362 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.295297 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5jlc\" (UniqueName: \"kubernetes.io/projected/6112d18d-543e-4963-a027-c640879c9c06-kube-api-access-g5jlc\") pod \"isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:36.295362 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:30:36.295317 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-serving-cert: secret "isvc-sklearn-scale-raw-63746-predictor-serving-cert" not found Apr 24 21:30:36.295362 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.295334 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-scale-raw-63746-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6112d18d-543e-4963-a027-c640879c9c06-isvc-sklearn-scale-raw-63746-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:36.295503 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:30:36.295394 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6112d18d-543e-4963-a027-c640879c9c06-proxy-tls podName:6112d18d-543e-4963-a027-c640879c9c06 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:36.795372044 +0000 UTC m=+824.929770656 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6112d18d-543e-4963-a027-c640879c9c06-proxy-tls") pod "isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" (UID: "6112d18d-543e-4963-a027-c640879c9c06") : secret "isvc-sklearn-scale-raw-63746-predictor-serving-cert" not found Apr 24 21:30:36.296004 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.295973 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6112d18d-543e-4963-a027-c640879c9c06-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:36.296130 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.296107 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-scale-raw-63746-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6112d18d-543e-4963-a027-c640879c9c06-isvc-sklearn-scale-raw-63746-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:36.304454 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.304432 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5jlc\" (UniqueName: \"kubernetes.io/projected/6112d18d-543e-4963-a027-c640879c9c06-kube-api-access-g5jlc\") pod \"isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:36.361720 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.361700 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:30:36.497341 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.497247 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjhgc\" (UniqueName: \"kubernetes.io/projected/cf53e686-46b5-470b-90cf-27b7cba22991-kube-api-access-kjhgc\") pod \"cf53e686-46b5-470b-90cf-27b7cba22991\" (UID: \"cf53e686-46b5-470b-90cf-27b7cba22991\") " Apr 24 21:30:36.497341 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.497318 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-raw-66937-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cf53e686-46b5-470b-90cf-27b7cba22991-message-dumper-raw-66937-kube-rbac-proxy-sar-config\") pod \"cf53e686-46b5-470b-90cf-27b7cba22991\" (UID: \"cf53e686-46b5-470b-90cf-27b7cba22991\") " Apr 24 21:30:36.497555 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.497366 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf53e686-46b5-470b-90cf-27b7cba22991-proxy-tls\") pod \"cf53e686-46b5-470b-90cf-27b7cba22991\" (UID: \"cf53e686-46b5-470b-90cf-27b7cba22991\") " Apr 24 21:30:36.497701 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.497665 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf53e686-46b5-470b-90cf-27b7cba22991-message-dumper-raw-66937-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-raw-66937-kube-rbac-proxy-sar-config") pod "cf53e686-46b5-470b-90cf-27b7cba22991" (UID: "cf53e686-46b5-470b-90cf-27b7cba22991"). InnerVolumeSpecName "message-dumper-raw-66937-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:36.499313 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.499290 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf53e686-46b5-470b-90cf-27b7cba22991-kube-api-access-kjhgc" (OuterVolumeSpecName: "kube-api-access-kjhgc") pod "cf53e686-46b5-470b-90cf-27b7cba22991" (UID: "cf53e686-46b5-470b-90cf-27b7cba22991"). InnerVolumeSpecName "kube-api-access-kjhgc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:36.499373 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.499340 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf53e686-46b5-470b-90cf-27b7cba22991-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cf53e686-46b5-470b-90cf-27b7cba22991" (UID: "cf53e686-46b5-470b-90cf-27b7cba22991"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:36.598614 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.598582 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf53e686-46b5-470b-90cf-27b7cba22991-proxy-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:30:36.598614 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.598608 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kjhgc\" (UniqueName: \"kubernetes.io/projected/cf53e686-46b5-470b-90cf-27b7cba22991-kube-api-access-kjhgc\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:30:36.598614 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.598618 2569 reconciler_common.go:299] "Volume detached for volume \"message-dumper-raw-66937-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cf53e686-46b5-470b-90cf-27b7cba22991-message-dumper-raw-66937-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 
21:30:36.799810 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:36.799778 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6112d18d-543e-4963-a027-c640879c9c06-proxy-tls\") pod \"isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:36.799977 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:30:36.799942 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-serving-cert: secret "isvc-sklearn-scale-raw-63746-predictor-serving-cert" not found Apr 24 21:30:36.800041 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:30:36.800018 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6112d18d-543e-4963-a027-c640879c9c06-proxy-tls podName:6112d18d-543e-4963-a027-c640879c9c06 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:37.799998454 +0000 UTC m=+825.934397070 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6112d18d-543e-4963-a027-c640879c9c06-proxy-tls") pod "isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" (UID: "6112d18d-543e-4963-a027-c640879c9c06") : secret "isvc-sklearn-scale-raw-63746-predictor-serving-cert" not found Apr 24 21:30:37.095201 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.095124 2569 generic.go:358] "Generic (PLEG): container finished" podID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerID="a72ad9f2596b6a0f6b6b70d260652e6da5bc7ccbaa9d61423fdab36fdf450dc0" exitCode=2 Apr 24 21:30:37.095551 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.095192 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" event={"ID":"34f17ebf-3db3-42bf-af14-e93e366ee8be","Type":"ContainerDied","Data":"a72ad9f2596b6a0f6b6b70d260652e6da5bc7ccbaa9d61423fdab36fdf450dc0"} Apr 24 21:30:37.099184 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.099161 2569 generic.go:358] "Generic (PLEG): container finished" podID="cf53e686-46b5-470b-90cf-27b7cba22991" containerID="88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075" exitCode=2 Apr 24 21:30:37.099184 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.099184 2569 generic.go:358] "Generic (PLEG): container finished" podID="cf53e686-46b5-470b-90cf-27b7cba22991" containerID="9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3" exitCode=2 Apr 24 21:30:37.099367 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.099204 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" event={"ID":"cf53e686-46b5-470b-90cf-27b7cba22991","Type":"ContainerDied","Data":"88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075"} Apr 24 21:30:37.099367 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.099228 2569 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" event={"ID":"cf53e686-46b5-470b-90cf-27b7cba22991","Type":"ContainerDied","Data":"9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3"} Apr 24 21:30:37.099367 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.099241 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" event={"ID":"cf53e686-46b5-470b-90cf-27b7cba22991","Type":"ContainerDied","Data":"5ca68c2784417db9d6305992365d2355315702128cf84cfdb47870c2c796d2f9"} Apr 24 21:30:37.099367 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.099249 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs" Apr 24 21:30:37.099367 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.099259 2569 scope.go:117] "RemoveContainer" containerID="88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075" Apr 24 21:30:37.108034 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.108018 2569 scope.go:117] "RemoveContainer" containerID="9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3" Apr 24 21:30:37.115271 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.115253 2569 scope.go:117] "RemoveContainer" containerID="88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075" Apr 24 21:30:37.115516 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:30:37.115499 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075\": container with ID starting with 88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075 not found: ID does not exist" containerID="88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075" Apr 24 21:30:37.115559 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.115524 2569 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075"} err="failed to get container status \"88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075\": rpc error: code = NotFound desc = could not find container \"88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075\": container with ID starting with 88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075 not found: ID does not exist" Apr 24 21:30:37.115559 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.115541 2569 scope.go:117] "RemoveContainer" containerID="9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3" Apr 24 21:30:37.115778 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:30:37.115744 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3\": container with ID starting with 9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3 not found: ID does not exist" containerID="9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3" Apr 24 21:30:37.115873 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.115779 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3"} err="failed to get container status \"9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3\": rpc error: code = NotFound desc = could not find container \"9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3\": container with ID starting with 9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3 not found: ID does not exist" Apr 24 21:30:37.115873 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.115792 2569 scope.go:117] "RemoveContainer" 
containerID="88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075" Apr 24 21:30:37.115998 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.115981 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075"} err="failed to get container status \"88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075\": rpc error: code = NotFound desc = could not find container \"88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075\": container with ID starting with 88b675288282e438b8178f137efce22df7d9576cd6501cfb84a7b4d7b1a65075 not found: ID does not exist" Apr 24 21:30:37.116038 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.115998 2569 scope.go:117] "RemoveContainer" containerID="9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3" Apr 24 21:30:37.116183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.116167 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3"} err="failed to get container status \"9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3\": rpc error: code = NotFound desc = could not find container \"9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3\": container with ID starting with 9a9e24ab31fb2229948b6765a92ea81e13196d4ba6777d5a1db7e6ee5173e9d3 not found: ID does not exist" Apr 24 21:30:37.120693 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.120673 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs"] Apr 24 21:30:37.122150 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.122127 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-66937-predictor-86db465969-zxjgs"] Apr 24 21:30:37.807710 ip-10-0-132-118 kubenswrapper[2569]: I0424 
21:30:37.807667 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6112d18d-543e-4963-a027-c640879c9c06-proxy-tls\") pod \"isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:37.809881 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.809855 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6112d18d-543e-4963-a027-c640879c9c06-proxy-tls\") pod \"isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:37.899329 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:37.899299 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:38.020443 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:38.020421 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg"] Apr 24 21:30:38.022486 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:30:38.022458 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6112d18d_543e_4963_a027_c640879c9c06.slice/crio-d6ef28f12673feddeb2f89df7e10fcfadc4e18cb051c9eeb07f9a6c253885a05 WatchSource:0}: Error finding container d6ef28f12673feddeb2f89df7e10fcfadc4e18cb051c9eeb07f9a6c253885a05: Status 404 returned error can't find the container with id d6ef28f12673feddeb2f89df7e10fcfadc4e18cb051c9eeb07f9a6c253885a05 Apr 24 21:30:38.103631 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:38.103603 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" event={"ID":"6112d18d-543e-4963-a027-c640879c9c06","Type":"ContainerStarted","Data":"c5a9b2a7c4261cfaafa5778defa496167f72d2f34d2c70ae2d020e0ebb8ea847"} Apr 24 21:30:38.104023 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:38.103638 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" event={"ID":"6112d18d-543e-4963-a027-c640879c9c06","Type":"ContainerStarted","Data":"d6ef28f12673feddeb2f89df7e10fcfadc4e18cb051c9eeb07f9a6c253885a05"} Apr 24 21:30:38.435979 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:38.435907 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf53e686-46b5-470b-90cf-27b7cba22991" path="/var/lib/kubelet/pods/cf53e686-46b5-470b-90cf-27b7cba22991/volumes" Apr 24 21:30:38.854230 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:38.854190 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 24 21:30:40.111538 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:40.111505 2569 generic.go:358] "Generic (PLEG): container finished" podID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerID="5b3badfd0a1b1e0c19f7d947f9ab71f7ce845fdfec3bdb03ae562e3178f9400d" exitCode=0 Apr 24 21:30:40.111913 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:40.111577 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" event={"ID":"34f17ebf-3db3-42bf-af14-e93e366ee8be","Type":"ContainerDied","Data":"5b3badfd0a1b1e0c19f7d947f9ab71f7ce845fdfec3bdb03ae562e3178f9400d"} Apr 24 21:30:42.119831 ip-10-0-132-118 kubenswrapper[2569]: I0424 
21:30:42.119797 2569 generic.go:358] "Generic (PLEG): container finished" podID="6112d18d-543e-4963-a027-c640879c9c06" containerID="c5a9b2a7c4261cfaafa5778defa496167f72d2f34d2c70ae2d020e0ebb8ea847" exitCode=0 Apr 24 21:30:42.120249 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:42.119837 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" event={"ID":"6112d18d-543e-4963-a027-c640879c9c06","Type":"ContainerDied","Data":"c5a9b2a7c4261cfaafa5778defa496167f72d2f34d2c70ae2d020e0ebb8ea847"} Apr 24 21:30:43.124874 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:43.124834 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" event={"ID":"6112d18d-543e-4963-a027-c640879c9c06","Type":"ContainerStarted","Data":"f5e3dedf3265888ee2dfc5af772a5f46dbbd7e4bfd80a468b09b601a4a1758fb"} Apr 24 21:30:43.124874 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:43.124870 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" event={"ID":"6112d18d-543e-4963-a027-c640879c9c06","Type":"ContainerStarted","Data":"452d0301f4f871a763b95896398d5c03b7b8b3113c4b84e5ed7f9a8c387f1d5f"} Apr 24 21:30:43.125251 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:43.125073 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:43.145395 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:43.145346 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podStartSLOduration=7.145330398 podStartE2EDuration="7.145330398s" podCreationTimestamp="2026-04-24 21:30:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:43.144088115 +0000 UTC m=+831.278486737" watchObservedRunningTime="2026-04-24 21:30:43.145330398 +0000 UTC m=+831.279729024" Apr 24 21:30:43.854408 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:43.854366 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 24 21:30:43.858782 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:43.858726 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 21:30:43.859082 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:43.859056 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:44.128143 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:44.128073 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:44.129246 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:44.129210 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 21:30:45.131360 ip-10-0-132-118 
kubenswrapper[2569]: I0424 21:30:45.131316 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 21:30:48.854379 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:48.854331 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 24 21:30:48.854798 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:48.854470 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:30:50.135413 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:50.135384 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" Apr 24 21:30:50.135875 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:50.135850 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 21:30:53.854620 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:53.854581 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 
10.134.0.28:8643: connect: connection refused" Apr 24 21:30:53.858988 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:53.858959 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 21:30:53.859287 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:53.859266 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:58.854166 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:30:58.854124 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 24 21:31:00.136842 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:00.136795 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 21:31:03.854641 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:03.854604 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" 
Apr 24 21:31:03.859048 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:03.859026 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 21:31:03.859201 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:03.859183 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:31:03.859405 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:03.859385 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:31:03.859482 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:03.859472 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:31:06.163013 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.162991 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:31:06.196202 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.196173 2569 generic.go:358] "Generic (PLEG): container finished" podID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerID="8202add44422ab8d72b2697f88f5df5df4e335406aafaba4e060b8e69934286f" exitCode=0 Apr 24 21:31:06.196341 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.196248 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" event={"ID":"34f17ebf-3db3-42bf-af14-e93e366ee8be","Type":"ContainerDied","Data":"8202add44422ab8d72b2697f88f5df5df4e335406aafaba4e060b8e69934286f"} Apr 24 21:31:06.196341 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.196278 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" event={"ID":"34f17ebf-3db3-42bf-af14-e93e366ee8be","Type":"ContainerDied","Data":"cef0016369dcf8efaa542fff24613b3d7828da6995392feccfe40172680aaca3"} Apr 24 21:31:06.196341 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.196279 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz" Apr 24 21:31:06.196341 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.196303 2569 scope.go:117] "RemoveContainer" containerID="8202add44422ab8d72b2697f88f5df5df4e335406aafaba4e060b8e69934286f" Apr 24 21:31:06.204946 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.204931 2569 scope.go:117] "RemoveContainer" containerID="a72ad9f2596b6a0f6b6b70d260652e6da5bc7ccbaa9d61423fdab36fdf450dc0" Apr 24 21:31:06.211882 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.211865 2569 scope.go:117] "RemoveContainer" containerID="5b3badfd0a1b1e0c19f7d947f9ab71f7ce845fdfec3bdb03ae562e3178f9400d" Apr 24 21:31:06.218127 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.218114 2569 scope.go:117] "RemoveContainer" containerID="047c70486bdd870decdabf7b489c54b0f1b847aea3017f71aaf48fe616b87222" Apr 24 21:31:06.224420 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.224403 2569 scope.go:117] "RemoveContainer" containerID="8202add44422ab8d72b2697f88f5df5df4e335406aafaba4e060b8e69934286f" Apr 24 21:31:06.224652 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:31:06.224635 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8202add44422ab8d72b2697f88f5df5df4e335406aafaba4e060b8e69934286f\": container with ID starting with 8202add44422ab8d72b2697f88f5df5df4e335406aafaba4e060b8e69934286f not found: ID does not exist" containerID="8202add44422ab8d72b2697f88f5df5df4e335406aafaba4e060b8e69934286f" Apr 24 21:31:06.224700 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.224660 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8202add44422ab8d72b2697f88f5df5df4e335406aafaba4e060b8e69934286f"} err="failed to get container status \"8202add44422ab8d72b2697f88f5df5df4e335406aafaba4e060b8e69934286f\": rpc error: code = NotFound desc = could not find container 
\"8202add44422ab8d72b2697f88f5df5df4e335406aafaba4e060b8e69934286f\": container with ID starting with 8202add44422ab8d72b2697f88f5df5df4e335406aafaba4e060b8e69934286f not found: ID does not exist"
Apr 24 21:31:06.224700 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.224679 2569 scope.go:117] "RemoveContainer" containerID="a72ad9f2596b6a0f6b6b70d260652e6da5bc7ccbaa9d61423fdab36fdf450dc0"
Apr 24 21:31:06.224935 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:31:06.224915 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a72ad9f2596b6a0f6b6b70d260652e6da5bc7ccbaa9d61423fdab36fdf450dc0\": container with ID starting with a72ad9f2596b6a0f6b6b70d260652e6da5bc7ccbaa9d61423fdab36fdf450dc0 not found: ID does not exist" containerID="a72ad9f2596b6a0f6b6b70d260652e6da5bc7ccbaa9d61423fdab36fdf450dc0"
Apr 24 21:31:06.225002 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.224945 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a72ad9f2596b6a0f6b6b70d260652e6da5bc7ccbaa9d61423fdab36fdf450dc0"} err="failed to get container status \"a72ad9f2596b6a0f6b6b70d260652e6da5bc7ccbaa9d61423fdab36fdf450dc0\": rpc error: code = NotFound desc = could not find container \"a72ad9f2596b6a0f6b6b70d260652e6da5bc7ccbaa9d61423fdab36fdf450dc0\": container with ID starting with a72ad9f2596b6a0f6b6b70d260652e6da5bc7ccbaa9d61423fdab36fdf450dc0 not found: ID does not exist"
Apr 24 21:31:06.225002 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.224967 2569 scope.go:117] "RemoveContainer" containerID="5b3badfd0a1b1e0c19f7d947f9ab71f7ce845fdfec3bdb03ae562e3178f9400d"
Apr 24 21:31:06.225214 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:31:06.225198 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b3badfd0a1b1e0c19f7d947f9ab71f7ce845fdfec3bdb03ae562e3178f9400d\": container with ID starting with 5b3badfd0a1b1e0c19f7d947f9ab71f7ce845fdfec3bdb03ae562e3178f9400d not found: ID does not exist" containerID="5b3badfd0a1b1e0c19f7d947f9ab71f7ce845fdfec3bdb03ae562e3178f9400d"
Apr 24 21:31:06.225253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.225219 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b3badfd0a1b1e0c19f7d947f9ab71f7ce845fdfec3bdb03ae562e3178f9400d"} err="failed to get container status \"5b3badfd0a1b1e0c19f7d947f9ab71f7ce845fdfec3bdb03ae562e3178f9400d\": rpc error: code = NotFound desc = could not find container \"5b3badfd0a1b1e0c19f7d947f9ab71f7ce845fdfec3bdb03ae562e3178f9400d\": container with ID starting with 5b3badfd0a1b1e0c19f7d947f9ab71f7ce845fdfec3bdb03ae562e3178f9400d not found: ID does not exist"
Apr 24 21:31:06.225253 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.225233 2569 scope.go:117] "RemoveContainer" containerID="047c70486bdd870decdabf7b489c54b0f1b847aea3017f71aaf48fe616b87222"
Apr 24 21:31:06.225441 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:31:06.225425 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"047c70486bdd870decdabf7b489c54b0f1b847aea3017f71aaf48fe616b87222\": container with ID starting with 047c70486bdd870decdabf7b489c54b0f1b847aea3017f71aaf48fe616b87222 not found: ID does not exist" containerID="047c70486bdd870decdabf7b489c54b0f1b847aea3017f71aaf48fe616b87222"
Apr 24 21:31:06.225483 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.225446 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"047c70486bdd870decdabf7b489c54b0f1b847aea3017f71aaf48fe616b87222"} err="failed to get container status \"047c70486bdd870decdabf7b489c54b0f1b847aea3017f71aaf48fe616b87222\": rpc error: code = NotFound desc = could not find container \"047c70486bdd870decdabf7b489c54b0f1b847aea3017f71aaf48fe616b87222\": container with ID starting with 047c70486bdd870decdabf7b489c54b0f1b847aea3017f71aaf48fe616b87222 not found: ID does not exist"
Apr 24 21:31:06.226649 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.226633 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd8dz\" (UniqueName: \"kubernetes.io/projected/34f17ebf-3db3-42bf-af14-e93e366ee8be-kube-api-access-zd8dz\") pod \"34f17ebf-3db3-42bf-af14-e93e366ee8be\" (UID: \"34f17ebf-3db3-42bf-af14-e93e366ee8be\") "
Apr 24 21:31:06.226730 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.226671 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34f17ebf-3db3-42bf-af14-e93e366ee8be-proxy-tls\") pod \"34f17ebf-3db3-42bf-af14-e93e366ee8be\" (UID: \"34f17ebf-3db3-42bf-af14-e93e366ee8be\") "
Apr 24 21:31:06.226866 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.226749 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-raw-66937-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/34f17ebf-3db3-42bf-af14-e93e366ee8be-isvc-logger-raw-66937-kube-rbac-proxy-sar-config\") pod \"34f17ebf-3db3-42bf-af14-e93e366ee8be\" (UID: \"34f17ebf-3db3-42bf-af14-e93e366ee8be\") "
Apr 24 21:31:06.226943 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.226889 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34f17ebf-3db3-42bf-af14-e93e366ee8be-kserve-provision-location\") pod \"34f17ebf-3db3-42bf-af14-e93e366ee8be\" (UID: \"34f17ebf-3db3-42bf-af14-e93e366ee8be\") "
Apr 24 21:31:06.227156 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.227130 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34f17ebf-3db3-42bf-af14-e93e366ee8be-isvc-logger-raw-66937-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-raw-66937-kube-rbac-proxy-sar-config") pod "34f17ebf-3db3-42bf-af14-e93e366ee8be" (UID: "34f17ebf-3db3-42bf-af14-e93e366ee8be"). InnerVolumeSpecName "isvc-logger-raw-66937-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:06.227226 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.227157 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f17ebf-3db3-42bf-af14-e93e366ee8be-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "34f17ebf-3db3-42bf-af14-e93e366ee8be" (UID: "34f17ebf-3db3-42bf-af14-e93e366ee8be"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:31:06.228459 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.228441 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34f17ebf-3db3-42bf-af14-e93e366ee8be-kube-api-access-zd8dz" (OuterVolumeSpecName: "kube-api-access-zd8dz") pod "34f17ebf-3db3-42bf-af14-e93e366ee8be" (UID: "34f17ebf-3db3-42bf-af14-e93e366ee8be"). InnerVolumeSpecName "kube-api-access-zd8dz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:31:06.228625 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.228605 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34f17ebf-3db3-42bf-af14-e93e366ee8be-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "34f17ebf-3db3-42bf-af14-e93e366ee8be" (UID: "34f17ebf-3db3-42bf-af14-e93e366ee8be"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:06.328524 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.328451 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zd8dz\" (UniqueName: \"kubernetes.io/projected/34f17ebf-3db3-42bf-af14-e93e366ee8be-kube-api-access-zd8dz\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:31:06.328524 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.328477 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34f17ebf-3db3-42bf-af14-e93e366ee8be-proxy-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:31:06.328524 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.328492 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-raw-66937-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/34f17ebf-3db3-42bf-af14-e93e366ee8be-isvc-logger-raw-66937-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:31:06.328524 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.328506 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34f17ebf-3db3-42bf-af14-e93e366ee8be-kserve-provision-location\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:31:06.516101 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.516068 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz"]
Apr 24 21:31:06.519904 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:06.519879 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-66937-predictor-56b7d5bf78-2nrcz"]
Apr 24 21:31:08.436285 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:08.436247 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" path="/var/lib/kubelet/pods/34f17ebf-3db3-42bf-af14-e93e366ee8be/volumes"
Apr 24 21:31:10.136295 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:10.136249 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:31:20.136471 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:20.136427 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:31:30.136020 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:30.135976 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:31:40.135923 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:40.135876 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:31:50.136960 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:31:50.136869 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:32:00.136043 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:32:00.136002 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:32:10.136629 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:32:10.136586 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:32:16.433273 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:32:16.433221 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:32:26.433468 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:32:26.433425 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:32:36.433183 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:32:36.433135 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:32:46.433620 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:32:46.433577 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:32:56.435747 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:32:56.435714 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg"
Apr 24 21:33:06.242010 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.241975 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg"]
Apr 24 21:33:06.242365 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.242291 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" containerID="cri-o://452d0301f4f871a763b95896398d5c03b7b8b3113c4b84e5ed7f9a8c387f1d5f" gracePeriod=30
Apr 24 21:33:06.242418 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.242326 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kube-rbac-proxy" containerID="cri-o://f5e3dedf3265888ee2dfc5af772a5f46dbbd7e4bfd80a468b09b601a4a1758fb" gracePeriod=30
Apr 24 21:33:06.375248 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375221 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"]
Apr 24 21:33:06.375592 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375577 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="agent"
Apr 24 21:33:06.375640 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375596 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="agent"
Apr 24 21:33:06.375640 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375611 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kube-rbac-proxy"
Apr 24 21:33:06.375640 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375617 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kube-rbac-proxy"
Apr 24 21:33:06.375640 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375633 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kserve-container"
Apr 24 21:33:06.375640 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375639 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kserve-container"
Apr 24 21:33:06.375811 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375649 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="storage-initializer"
Apr 24 21:33:06.375811 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375654 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="storage-initializer"
Apr 24 21:33:06.375811 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375660 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf53e686-46b5-470b-90cf-27b7cba22991" containerName="kserve-container"
Apr 24 21:33:06.375811 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375665 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf53e686-46b5-470b-90cf-27b7cba22991" containerName="kserve-container"
Apr 24 21:33:06.375811 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375676 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf53e686-46b5-470b-90cf-27b7cba22991" containerName="kube-rbac-proxy"
Apr 24 21:33:06.375811 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375682 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf53e686-46b5-470b-90cf-27b7cba22991" containerName="kube-rbac-proxy"
Apr 24 21:33:06.375811 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375740 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf53e686-46b5-470b-90cf-27b7cba22991" containerName="kserve-container"
Apr 24 21:33:06.375811 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375748 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="agent"
Apr 24 21:33:06.375811 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375780 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kserve-container"
Apr 24 21:33:06.375811 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375790 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf53e686-46b5-470b-90cf-27b7cba22991" containerName="kube-rbac-proxy"
Apr 24 21:33:06.375811 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.375798 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="34f17ebf-3db3-42bf-af14-e93e366ee8be" containerName="kube-rbac-proxy"
Apr 24 21:33:06.378855 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.378836 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:06.381746 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.381728 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-8d5709-predictor-serving-cert\""
Apr 24 21:33:06.381913 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.381897 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-8d5709-kube-rbac-proxy-sar-config\""
Apr 24 21:33:06.391857 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.391837 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"]
Apr 24 21:33:06.433325 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.433289 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:33:06.480230 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.480200 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zcmg\" (UniqueName: \"kubernetes.io/projected/63d2248f-f7e9-4ed0-b369-0d40a0b04454-kube-api-access-5zcmg\") pod \"isvc-primary-8d5709-predictor-59cbfc9977-bkkhd\" (UID: \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\") " pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:06.480505 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.480268 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63d2248f-f7e9-4ed0-b369-0d40a0b04454-proxy-tls\") pod \"isvc-primary-8d5709-predictor-59cbfc9977-bkkhd\" (UID: \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\") " pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:06.480505 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.480365 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-8d5709-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/63d2248f-f7e9-4ed0-b369-0d40a0b04454-isvc-primary-8d5709-kube-rbac-proxy-sar-config\") pod \"isvc-primary-8d5709-predictor-59cbfc9977-bkkhd\" (UID: \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\") " pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:06.480505 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.480403 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63d2248f-f7e9-4ed0-b369-0d40a0b04454-kserve-provision-location\") pod \"isvc-primary-8d5709-predictor-59cbfc9977-bkkhd\" (UID: \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\") " pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:06.536575 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.536540 2569 generic.go:358] "Generic (PLEG): container finished" podID="6112d18d-543e-4963-a027-c640879c9c06" containerID="f5e3dedf3265888ee2dfc5af772a5f46dbbd7e4bfd80a468b09b601a4a1758fb" exitCode=2
Apr 24 21:33:06.536705 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.536607 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" event={"ID":"6112d18d-543e-4963-a027-c640879c9c06","Type":"ContainerDied","Data":"f5e3dedf3265888ee2dfc5af772a5f46dbbd7e4bfd80a468b09b601a4a1758fb"}
Apr 24 21:33:06.581048 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.581025 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zcmg\" (UniqueName: \"kubernetes.io/projected/63d2248f-f7e9-4ed0-b369-0d40a0b04454-kube-api-access-5zcmg\") pod \"isvc-primary-8d5709-predictor-59cbfc9977-bkkhd\" (UID: \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\") " pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:06.581138 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.581074 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63d2248f-f7e9-4ed0-b369-0d40a0b04454-proxy-tls\") pod \"isvc-primary-8d5709-predictor-59cbfc9977-bkkhd\" (UID: \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\") " pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:06.581257 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.581240 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-8d5709-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/63d2248f-f7e9-4ed0-b369-0d40a0b04454-isvc-primary-8d5709-kube-rbac-proxy-sar-config\") pod \"isvc-primary-8d5709-predictor-59cbfc9977-bkkhd\" (UID: \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\") " pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:06.581305 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.581273 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63d2248f-f7e9-4ed0-b369-0d40a0b04454-kserve-provision-location\") pod \"isvc-primary-8d5709-predictor-59cbfc9977-bkkhd\" (UID: \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\") " pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:06.581597 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.581580 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63d2248f-f7e9-4ed0-b369-0d40a0b04454-kserve-provision-location\") pod \"isvc-primary-8d5709-predictor-59cbfc9977-bkkhd\" (UID: \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\") " pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:06.581886 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.581865 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-8d5709-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/63d2248f-f7e9-4ed0-b369-0d40a0b04454-isvc-primary-8d5709-kube-rbac-proxy-sar-config\") pod \"isvc-primary-8d5709-predictor-59cbfc9977-bkkhd\" (UID: \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\") " pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:06.583322 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.583305 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63d2248f-f7e9-4ed0-b369-0d40a0b04454-proxy-tls\") pod \"isvc-primary-8d5709-predictor-59cbfc9977-bkkhd\" (UID: \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\") " pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:06.589872 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.589849 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zcmg\" (UniqueName: \"kubernetes.io/projected/63d2248f-f7e9-4ed0-b369-0d40a0b04454-kube-api-access-5zcmg\") pod \"isvc-primary-8d5709-predictor-59cbfc9977-bkkhd\" (UID: \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\") " pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:06.688949 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.688922 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:06.808982 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:06.808959 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"]
Apr 24 21:33:06.811403 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:33:06.811377 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63d2248f_f7e9_4ed0_b369_0d40a0b04454.slice/crio-0e6943472526b3a52cf1b569ca73b582644b6404d12b0ccd87b60265a7a56c23 WatchSource:0}: Error finding container 0e6943472526b3a52cf1b569ca73b582644b6404d12b0ccd87b60265a7a56c23: Status 404 returned error can't find the container with id 0e6943472526b3a52cf1b569ca73b582644b6404d12b0ccd87b60265a7a56c23
Apr 24 21:33:07.541661 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:07.541619 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" event={"ID":"63d2248f-f7e9-4ed0-b369-0d40a0b04454","Type":"ContainerStarted","Data":"8880e20f4f0f0c85518dc9dee678571645cee637b63e3ff5c638dc49ab090971"}
Apr 24 21:33:07.541661 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:07.541663 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" event={"ID":"63d2248f-f7e9-4ed0-b369-0d40a0b04454","Type":"ContainerStarted","Data":"0e6943472526b3a52cf1b569ca73b582644b6404d12b0ccd87b60265a7a56c23"}
Apr 24 21:33:10.132492 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:10.132452 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused"
Apr 24 21:33:11.554983 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:11.554915 2569 generic.go:358] "Generic (PLEG): container finished" podID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerID="8880e20f4f0f0c85518dc9dee678571645cee637b63e3ff5c638dc49ab090971" exitCode=0
Apr 24 21:33:11.554983 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:11.554973 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" event={"ID":"63d2248f-f7e9-4ed0-b369-0d40a0b04454","Type":"ContainerDied","Data":"8880e20f4f0f0c85518dc9dee678571645cee637b63e3ff5c638dc49ab090971"}
Apr 24 21:33:12.559359 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:12.559324 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" event={"ID":"63d2248f-f7e9-4ed0-b369-0d40a0b04454","Type":"ContainerStarted","Data":"4b2feb7cfae72b910dbe8cab1f60084dfa7cbc876ad8dcbae8d4aed0ed59fa0a"}
Apr 24 21:33:12.559359 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:12.559364 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" event={"ID":"63d2248f-f7e9-4ed0-b369-0d40a0b04454","Type":"ContainerStarted","Data":"e06d439183a93c884670ec71c5ea5e5b587dbafb5f07d2453edbcbaf6fe89a88"}
Apr 24 21:33:12.559788 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:12.559563 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:12.583633 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:12.583584 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" podStartSLOduration=6.583571595 podStartE2EDuration="6.583571595s" podCreationTimestamp="2026-04-24 21:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:33:12.581492932 +0000 UTC m=+980.715891554" watchObservedRunningTime="2026-04-24 21:33:12.583571595 +0000 UTC m=+980.717970217"
Apr 24 21:33:13.563093 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:13.563065 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:13.564174 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:13.564147 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 24 21:33:14.567315 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:14.567287 2569 generic.go:358] "Generic (PLEG): container finished" podID="6112d18d-543e-4963-a027-c640879c9c06" containerID="452d0301f4f871a763b95896398d5c03b7b8b3113c4b84e5ed7f9a8c387f1d5f" exitCode=0
Apr 24 21:33:14.567597 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:14.567361 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" event={"ID":"6112d18d-543e-4963-a027-c640879c9c06","Type":"ContainerDied","Data":"452d0301f4f871a763b95896398d5c03b7b8b3113c4b84e5ed7f9a8c387f1d5f"}
Apr 24 21:33:14.567683 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:14.567663 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 24 21:33:14.580875 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:14.580859 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg"
Apr 24 21:33:14.651505 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:14.651484 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6112d18d-543e-4963-a027-c640879c9c06-proxy-tls\") pod \"6112d18d-543e-4963-a027-c640879c9c06\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") "
Apr 24 21:33:14.651622 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:14.651521 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6112d18d-543e-4963-a027-c640879c9c06-kserve-provision-location\") pod \"6112d18d-543e-4963-a027-c640879c9c06\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") "
Apr 24 21:33:14.651622 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:14.651578 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5jlc\" (UniqueName: \"kubernetes.io/projected/6112d18d-543e-4963-a027-c640879c9c06-kube-api-access-g5jlc\") pod \"6112d18d-543e-4963-a027-c640879c9c06\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") "
Apr 24 21:33:14.651699 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:14.651629 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-scale-raw-63746-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6112d18d-543e-4963-a027-c640879c9c06-isvc-sklearn-scale-raw-63746-kube-rbac-proxy-sar-config\") pod \"6112d18d-543e-4963-a027-c640879c9c06\" (UID: \"6112d18d-543e-4963-a027-c640879c9c06\") "
Apr 24 21:33:14.651938 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:14.651908 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6112d18d-543e-4963-a027-c640879c9c06-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6112d18d-543e-4963-a027-c640879c9c06" (UID: "6112d18d-543e-4963-a027-c640879c9c06"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:33:14.652067 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:14.651990 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6112d18d-543e-4963-a027-c640879c9c06-isvc-sklearn-scale-raw-63746-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-scale-raw-63746-kube-rbac-proxy-sar-config") pod "6112d18d-543e-4963-a027-c640879c9c06" (UID: "6112d18d-543e-4963-a027-c640879c9c06"). InnerVolumeSpecName "isvc-sklearn-scale-raw-63746-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:33:14.652067 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:14.652053 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6112d18d-543e-4963-a027-c640879c9c06-kserve-provision-location\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:33:14.653709 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:14.653679 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6112d18d-543e-4963-a027-c640879c9c06-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6112d18d-543e-4963-a027-c640879c9c06" (UID: "6112d18d-543e-4963-a027-c640879c9c06"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:33:14.653829 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:14.653777 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6112d18d-543e-4963-a027-c640879c9c06-kube-api-access-g5jlc" (OuterVolumeSpecName: "kube-api-access-g5jlc") pod "6112d18d-543e-4963-a027-c640879c9c06" (UID: "6112d18d-543e-4963-a027-c640879c9c06"). InnerVolumeSpecName "kube-api-access-g5jlc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:33:14.753058 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:14.752988 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6112d18d-543e-4963-a027-c640879c9c06-proxy-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:33:14.753058 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:14.753012 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5jlc\" (UniqueName: \"kubernetes.io/projected/6112d18d-543e-4963-a027-c640879c9c06-kube-api-access-g5jlc\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:33:14.753058 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:14.753025 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-scale-raw-63746-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6112d18d-543e-4963-a027-c640879c9c06-isvc-sklearn-scale-raw-63746-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:33:15.571593 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:15.571557 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg" event={"ID":"6112d18d-543e-4963-a027-c640879c9c06","Type":"ContainerDied","Data":"d6ef28f12673feddeb2f89df7e10fcfadc4e18cb051c9eeb07f9a6c253885a05"}
Apr 24 21:33:15.572007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:15.571609 2569 scope.go:117] "RemoveContainer" containerID="f5e3dedf3265888ee2dfc5af772a5f46dbbd7e4bfd80a468b09b601a4a1758fb"
Apr 24 21:33:15.572007 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:15.571622 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg"
Apr 24 21:33:15.580440 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:15.580422 2569 scope.go:117] "RemoveContainer" containerID="452d0301f4f871a763b95896398d5c03b7b8b3113c4b84e5ed7f9a8c387f1d5f"
Apr 24 21:33:15.587078 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:15.587061 2569 scope.go:117] "RemoveContainer" containerID="c5a9b2a7c4261cfaafa5778defa496167f72d2f34d2c70ae2d020e0ebb8ea847"
Apr 24 21:33:15.594788 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:15.594768 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg"]
Apr 24 21:33:15.598277 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:15.598257 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-63746-predictor-5769d66c6b-z87xg"]
Apr 24 21:33:16.437366 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:16.437330 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6112d18d-543e-4963-a027-c640879c9c06" path="/var/lib/kubelet/pods/6112d18d-543e-4963-a027-c640879c9c06/volumes"
Apr 24 21:33:19.572034 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:19.572007 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"
Apr 24 21:33:19.572632 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:19.572607 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 24 21:33:29.572797 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:29.572737 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:33:39.573406 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:39.573368 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:33:49.572693 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:49.572656 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:33:59.573396 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:33:59.573354 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:34:09.572894 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:09.572850 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:34:19.573900 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:19.573872 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" Apr 24 21:34:26.486051 
ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.486011 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt"] Apr 24 21:34:26.486481 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.486339 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="storage-initializer" Apr 24 21:34:26.486481 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.486351 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="storage-initializer" Apr 24 21:34:26.486481 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.486361 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kube-rbac-proxy" Apr 24 21:34:26.486481 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.486366 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kube-rbac-proxy" Apr 24 21:34:26.486481 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.486397 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" Apr 24 21:34:26.486481 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.486402 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" Apr 24 21:34:26.486481 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.486454 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kserve-container" Apr 24 21:34:26.486481 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.486464 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6112d18d-543e-4963-a027-c640879c9c06" containerName="kube-rbac-proxy" Apr 24 
21:34:26.492201 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.492089 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:26.494999 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.494975 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-8d5709-predictor-serving-cert\"" Apr 24 21:34:26.495119 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.495081 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-8d5709-kube-rbac-proxy-sar-config\"" Apr 24 21:34:26.495311 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.495293 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-8d5709\"" Apr 24 21:34:26.495403 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.495360 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 21:34:26.495403 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.495375 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-8d5709-dockercfg-ww572\"" Apr 24 21:34:26.501425 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.501403 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt"] Apr 24 21:34:26.615976 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.615947 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prvzd\" (UniqueName: \"kubernetes.io/projected/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-kube-api-access-prvzd\") pod \"isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " 
pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:26.616131 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.616007 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-8d5709-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-isvc-secondary-8d5709-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:26.616131 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.616070 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-proxy-tls\") pod \"isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:26.616131 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.616102 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-kserve-provision-location\") pod \"isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:26.616244 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.616145 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-cabundle-cert\") pod \"isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " 
pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:26.717046 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.717022 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prvzd\" (UniqueName: \"kubernetes.io/projected/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-kube-api-access-prvzd\") pod \"isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:26.717187 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.717087 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-8d5709-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-isvc-secondary-8d5709-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:26.717187 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.717123 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-proxy-tls\") pod \"isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:26.717187 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.717152 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-kserve-provision-location\") pod \"isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 
21:34:26.717354 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.717195 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-cabundle-cert\") pod \"isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:26.717624 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.717598 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-kserve-provision-location\") pod \"isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:26.717788 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.717747 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-8d5709-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-isvc-secondary-8d5709-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:26.717788 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.717777 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-cabundle-cert\") pod \"isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:26.719493 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.719474 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-proxy-tls\") pod \"isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:26.725679 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.725659 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prvzd\" (UniqueName: \"kubernetes.io/projected/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-kube-api-access-prvzd\") pod \"isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:26.804375 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.804357 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:26.921394 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.921332 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt"] Apr 24 21:34:26.923639 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:34:26.923608 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb2529bd_999e_40c2_9d81_0ea7ce55fa8d.slice/crio-14db91ba508c4555aed4e32dead9ec22b48789edc571b8064cf202e583f263ba WatchSource:0}: Error finding container 14db91ba508c4555aed4e32dead9ec22b48789edc571b8064cf202e583f263ba: Status 404 returned error can't find the container with id 14db91ba508c4555aed4e32dead9ec22b48789edc571b8064cf202e583f263ba Apr 24 21:34:26.925408 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:26.925394 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:34:27.791230 
ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:27.791196 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" event={"ID":"db2529bd-999e-40c2-9d81-0ea7ce55fa8d","Type":"ContainerStarted","Data":"5f99bf391c7abd842e41b21eb6539976ec15d1fbb06c322d34227b2180f8474c"} Apr 24 21:34:27.791230 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:27.791233 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" event={"ID":"db2529bd-999e-40c2-9d81-0ea7ce55fa8d","Type":"ContainerStarted","Data":"14db91ba508c4555aed4e32dead9ec22b48789edc571b8064cf202e583f263ba"} Apr 24 21:34:33.811251 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:33.811220 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt_db2529bd-999e-40c2-9d81-0ea7ce55fa8d/storage-initializer/0.log" Apr 24 21:34:33.811636 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:33.811261 2569 generic.go:358] "Generic (PLEG): container finished" podID="db2529bd-999e-40c2-9d81-0ea7ce55fa8d" containerID="5f99bf391c7abd842e41b21eb6539976ec15d1fbb06c322d34227b2180f8474c" exitCode=1 Apr 24 21:34:33.811636 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:33.811336 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" event={"ID":"db2529bd-999e-40c2-9d81-0ea7ce55fa8d","Type":"ContainerDied","Data":"5f99bf391c7abd842e41b21eb6539976ec15d1fbb06c322d34227b2180f8474c"} Apr 24 21:34:34.816793 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:34.816751 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt_db2529bd-999e-40c2-9d81-0ea7ce55fa8d/storage-initializer/0.log" Apr 24 21:34:34.817136 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:34.816835 2569 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" event={"ID":"db2529bd-999e-40c2-9d81-0ea7ce55fa8d","Type":"ContainerStarted","Data":"078ed4fabe5ebe2cb0cc5f4278f2b89971709f417c8547daf4e3574ce908cff7"} Apr 24 21:34:38.829793 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:38.829744 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt_db2529bd-999e-40c2-9d81-0ea7ce55fa8d/storage-initializer/1.log" Apr 24 21:34:38.830168 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:38.830125 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt_db2529bd-999e-40c2-9d81-0ea7ce55fa8d/storage-initializer/0.log" Apr 24 21:34:38.830168 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:38.830157 2569 generic.go:358] "Generic (PLEG): container finished" podID="db2529bd-999e-40c2-9d81-0ea7ce55fa8d" containerID="078ed4fabe5ebe2cb0cc5f4278f2b89971709f417c8547daf4e3574ce908cff7" exitCode=1 Apr 24 21:34:38.830258 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:38.830238 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" event={"ID":"db2529bd-999e-40c2-9d81-0ea7ce55fa8d","Type":"ContainerDied","Data":"078ed4fabe5ebe2cb0cc5f4278f2b89971709f417c8547daf4e3574ce908cff7"} Apr 24 21:34:38.830299 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:38.830283 2569 scope.go:117] "RemoveContainer" containerID="5f99bf391c7abd842e41b21eb6539976ec15d1fbb06c322d34227b2180f8474c" Apr 24 21:34:38.830740 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:38.830722 2569 scope.go:117] "RemoveContainer" containerID="5f99bf391c7abd842e41b21eb6539976ec15d1fbb06c322d34227b2180f8474c" Apr 24 21:34:38.840397 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:34:38.840354 2569 log.go:32] "RemoveContainer from runtime service failed" err="rpc 
error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt_kserve-ci-e2e-test_db2529bd-999e-40c2-9d81-0ea7ce55fa8d_0 in pod sandbox 14db91ba508c4555aed4e32dead9ec22b48789edc571b8064cf202e583f263ba from index: no such id: '5f99bf391c7abd842e41b21eb6539976ec15d1fbb06c322d34227b2180f8474c'" containerID="5f99bf391c7abd842e41b21eb6539976ec15d1fbb06c322d34227b2180f8474c" Apr 24 21:34:38.840495 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:34:38.840419 2569 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt_kserve-ci-e2e-test_db2529bd-999e-40c2-9d81-0ea7ce55fa8d_0 in pod sandbox 14db91ba508c4555aed4e32dead9ec22b48789edc571b8064cf202e583f263ba from index: no such id: '5f99bf391c7abd842e41b21eb6539976ec15d1fbb06c322d34227b2180f8474c'; Skipping pod \"isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt_kserve-ci-e2e-test(db2529bd-999e-40c2-9d81-0ea7ce55fa8d)\"" logger="UnhandledError" Apr 24 21:34:38.841741 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:34:38.841717 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt_kserve-ci-e2e-test(db2529bd-999e-40c2-9d81-0ea7ce55fa8d)\"" pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" podUID="db2529bd-999e-40c2-9d81-0ea7ce55fa8d" Apr 24 21:34:39.835409 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:39.835380 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt_db2529bd-999e-40c2-9d81-0ea7ce55fa8d/storage-initializer/1.log" Apr 24 21:34:44.530348 ip-10-0-132-118 
kubenswrapper[2569]: I0424 21:34:44.530276 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt"] Apr 24 21:34:44.586836 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.586812 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"] Apr 24 21:34:44.587190 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.587142 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="kserve-container" containerID="cri-o://e06d439183a93c884670ec71c5ea5e5b587dbafb5f07d2453edbcbaf6fe89a88" gracePeriod=30 Apr 24 21:34:44.587286 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.587211 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="kube-rbac-proxy" containerID="cri-o://4b2feb7cfae72b910dbe8cab1f60084dfa7cbc876ad8dcbae8d4aed0ed59fa0a" gracePeriod=30 Apr 24 21:34:44.679802 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.679772 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf"] Apr 24 21:34:44.684448 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.684427 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:44.686986 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.686962 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-e681b7-kube-rbac-proxy-sar-config\"" Apr 24 21:34:44.687078 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.686996 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-e681b7\"" Apr 24 21:34:44.687078 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.687032 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-e681b7-predictor-serving-cert\"" Apr 24 21:34:44.687078 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.686998 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-e681b7-dockercfg-x25kd\"" Apr 24 21:34:44.692565 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.692396 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf"] Apr 24 21:34:44.706972 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.706954 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt_db2529bd-999e-40c2-9d81-0ea7ce55fa8d/storage-initializer/1.log" Apr 24 21:34:44.707056 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.707022 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:44.761173 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.761143 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-proxy-tls\") pod \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " Apr 24 21:34:44.761173 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.761177 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-kserve-provision-location\") pod \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " Apr 24 21:34:44.761398 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.761234 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-8d5709-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-isvc-secondary-8d5709-kube-rbac-proxy-sar-config\") pod \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " Apr 24 21:34:44.761398 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.761270 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-cabundle-cert\") pod \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " Apr 24 21:34:44.761398 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.761340 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prvzd\" (UniqueName: \"kubernetes.io/projected/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-kube-api-access-prvzd\") pod 
\"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\" (UID: \"db2529bd-999e-40c2-9d81-0ea7ce55fa8d\") " Apr 24 21:34:44.761549 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.761451 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97749758-c530-471e-a18d-cb99b3d0801e-kserve-provision-location\") pod \"isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") " pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:44.761549 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.761494 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "db2529bd-999e-40c2-9d81-0ea7ce55fa8d" (UID: "db2529bd-999e-40c2-9d81-0ea7ce55fa8d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:34:44.761549 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.761526 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/97749758-c530-471e-a18d-cb99b3d0801e-cabundle-cert\") pod \"isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") " pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:44.761703 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.761608 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kmk6\" (UniqueName: \"kubernetes.io/projected/97749758-c530-471e-a18d-cb99b3d0801e-kube-api-access-9kmk6\") pod \"isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") " pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:44.761703 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.761634 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-isvc-secondary-8d5709-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-8d5709-kube-rbac-proxy-sar-config") pod "db2529bd-999e-40c2-9d81-0ea7ce55fa8d" (UID: "db2529bd-999e-40c2-9d81-0ea7ce55fa8d"). InnerVolumeSpecName "isvc-secondary-8d5709-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:34:44.761703 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.761649 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97749758-c530-471e-a18d-cb99b3d0801e-proxy-tls\") pod \"isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") " pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:44.761703 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.761667 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "db2529bd-999e-40c2-9d81-0ea7ce55fa8d" (UID: "db2529bd-999e-40c2-9d81-0ea7ce55fa8d"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:34:44.761921 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.761723 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-e681b7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97749758-c530-471e-a18d-cb99b3d0801e-isvc-init-fail-e681b7-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") " pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:44.761921 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.761805 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-kserve-provision-location\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:34:44.761921 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.761817 2569 reconciler_common.go:299] "Volume detached for volume 
\"isvc-secondary-8d5709-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-isvc-secondary-8d5709-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:34:44.761921 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.761827 2569 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-cabundle-cert\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:34:44.763273 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.763253 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "db2529bd-999e-40c2-9d81-0ea7ce55fa8d" (UID: "db2529bd-999e-40c2-9d81-0ea7ce55fa8d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:34:44.763347 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.763333 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-kube-api-access-prvzd" (OuterVolumeSpecName: "kube-api-access-prvzd") pod "db2529bd-999e-40c2-9d81-0ea7ce55fa8d" (UID: "db2529bd-999e-40c2-9d81-0ea7ce55fa8d"). InnerVolumeSpecName "kube-api-access-prvzd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:34:44.850276 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.850206 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt_db2529bd-999e-40c2-9d81-0ea7ce55fa8d/storage-initializer/1.log" Apr 24 21:34:44.850406 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.850296 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" event={"ID":"db2529bd-999e-40c2-9d81-0ea7ce55fa8d","Type":"ContainerDied","Data":"14db91ba508c4555aed4e32dead9ec22b48789edc571b8064cf202e583f263ba"} Apr 24 21:34:44.850406 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.850328 2569 scope.go:117] "RemoveContainer" containerID="078ed4fabe5ebe2cb0cc5f4278f2b89971709f417c8547daf4e3574ce908cff7" Apr 24 21:34:44.850406 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.850328 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt" Apr 24 21:34:44.852448 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.852420 2569 generic.go:358] "Generic (PLEG): container finished" podID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerID="4b2feb7cfae72b910dbe8cab1f60084dfa7cbc876ad8dcbae8d4aed0ed59fa0a" exitCode=2 Apr 24 21:34:44.852548 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.852482 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" event={"ID":"63d2248f-f7e9-4ed0-b369-0d40a0b04454","Type":"ContainerDied","Data":"4b2feb7cfae72b910dbe8cab1f60084dfa7cbc876ad8dcbae8d4aed0ed59fa0a"} Apr 24 21:34:44.862379 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.862360 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/97749758-c530-471e-a18d-cb99b3d0801e-cabundle-cert\") pod \"isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") " pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:44.862477 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.862409 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kmk6\" (UniqueName: \"kubernetes.io/projected/97749758-c530-471e-a18d-cb99b3d0801e-kube-api-access-9kmk6\") pod \"isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") " pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:44.862477 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.862442 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97749758-c530-471e-a18d-cb99b3d0801e-proxy-tls\") pod \"isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf\" (UID: 
\"97749758-c530-471e-a18d-cb99b3d0801e\") " pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:44.862477 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.862468 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-e681b7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97749758-c530-471e-a18d-cb99b3d0801e-isvc-init-fail-e681b7-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") " pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:44.862583 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.862493 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97749758-c530-471e-a18d-cb99b3d0801e-kserve-provision-location\") pod \"isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") " pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:44.862583 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.862536 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-prvzd\" (UniqueName: \"kubernetes.io/projected/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-kube-api-access-prvzd\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:34:44.862583 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.862550 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db2529bd-999e-40c2-9d81-0ea7ce55fa8d-proxy-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:34:44.862714 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:34:44.862591 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-serving-cert: secret "isvc-init-fail-e681b7-predictor-serving-cert" not found 
Apr 24 21:34:44.862714 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:34:44.862667 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97749758-c530-471e-a18d-cb99b3d0801e-proxy-tls podName:97749758-c530-471e-a18d-cb99b3d0801e nodeName:}" failed. No retries permitted until 2026-04-24 21:34:45.362634826 +0000 UTC m=+1073.497033446 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/97749758-c530-471e-a18d-cb99b3d0801e-proxy-tls") pod "isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" (UID: "97749758-c530-471e-a18d-cb99b3d0801e") : secret "isvc-init-fail-e681b7-predictor-serving-cert" not found Apr 24 21:34:44.862953 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.862938 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97749758-c530-471e-a18d-cb99b3d0801e-kserve-provision-location\") pod \"isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") " pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:44.863044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.863028 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/97749758-c530-471e-a18d-cb99b3d0801e-cabundle-cert\") pod \"isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") " pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:44.863216 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.863194 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-e681b7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97749758-c530-471e-a18d-cb99b3d0801e-isvc-init-fail-e681b7-kube-rbac-proxy-sar-config\") pod 
\"isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") " pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:44.872636 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.872621 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kmk6\" (UniqueName: \"kubernetes.io/projected/97749758-c530-471e-a18d-cb99b3d0801e-kube-api-access-9kmk6\") pod \"isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") " pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:44.885828 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.885804 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt"] Apr 24 21:34:44.889615 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:44.889592 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-8d5709-predictor-67d976f9d6-pcqwt"] Apr 24 21:34:45.366635 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:45.366596 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97749758-c530-471e-a18d-cb99b3d0801e-proxy-tls\") pod \"isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") " pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:45.368941 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:45.368922 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97749758-c530-471e-a18d-cb99b3d0801e-proxy-tls\") pod \"isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") " pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:45.603101 ip-10-0-132-118 
kubenswrapper[2569]: I0424 21:34:45.603067 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" Apr 24 21:34:45.723107 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:45.723080 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf"] Apr 24 21:34:45.725051 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:34:45.725025 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97749758_c530_471e_a18d_cb99b3d0801e.slice/crio-8adffc0aff32f5d4b5203c4c72aefaccb603212c66f3d57c61167d54930aeb3c WatchSource:0}: Error finding container 8adffc0aff32f5d4b5203c4c72aefaccb603212c66f3d57c61167d54930aeb3c: Status 404 returned error can't find the container with id 8adffc0aff32f5d4b5203c4c72aefaccb603212c66f3d57c61167d54930aeb3c Apr 24 21:34:45.856684 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:45.856646 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" event={"ID":"97749758-c530-471e-a18d-cb99b3d0801e","Type":"ContainerStarted","Data":"fc2e0fa24731984ec62ad79eb01410d0df9076a9391b17d2516e17b40132a7e1"} Apr 24 21:34:45.856684 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:45.856686 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" event={"ID":"97749758-c530-471e-a18d-cb99b3d0801e","Type":"ContainerStarted","Data":"8adffc0aff32f5d4b5203c4c72aefaccb603212c66f3d57c61167d54930aeb3c"} Apr 24 21:34:46.436616 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:46.436585 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db2529bd-999e-40c2-9d81-0ea7ce55fa8d" path="/var/lib/kubelet/pods/db2529bd-999e-40c2-9d81-0ea7ce55fa8d/volumes" Apr 24 21:34:48.625970 ip-10-0-132-118 
kubenswrapper[2569]: I0424 21:34:48.625948 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" Apr 24 21:34:48.694554 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.694488 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-8d5709-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/63d2248f-f7e9-4ed0-b369-0d40a0b04454-isvc-primary-8d5709-kube-rbac-proxy-sar-config\") pod \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\" (UID: \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\") " Apr 24 21:34:48.694554 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.694538 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zcmg\" (UniqueName: \"kubernetes.io/projected/63d2248f-f7e9-4ed0-b369-0d40a0b04454-kube-api-access-5zcmg\") pod \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\" (UID: \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\") " Apr 24 21:34:48.694746 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.694586 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63d2248f-f7e9-4ed0-b369-0d40a0b04454-kserve-provision-location\") pod \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\" (UID: \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\") " Apr 24 21:34:48.694746 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.694616 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63d2248f-f7e9-4ed0-b369-0d40a0b04454-proxy-tls\") pod \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\" (UID: \"63d2248f-f7e9-4ed0-b369-0d40a0b04454\") " Apr 24 21:34:48.694922 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.694900 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/63d2248f-f7e9-4ed0-b369-0d40a0b04454-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "63d2248f-f7e9-4ed0-b369-0d40a0b04454" (UID: "63d2248f-f7e9-4ed0-b369-0d40a0b04454"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:34:48.694988 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.694917 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63d2248f-f7e9-4ed0-b369-0d40a0b04454-isvc-primary-8d5709-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-8d5709-kube-rbac-proxy-sar-config") pod "63d2248f-f7e9-4ed0-b369-0d40a0b04454" (UID: "63d2248f-f7e9-4ed0-b369-0d40a0b04454"). InnerVolumeSpecName "isvc-primary-8d5709-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:34:48.696639 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.696618 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d2248f-f7e9-4ed0-b369-0d40a0b04454-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "63d2248f-f7e9-4ed0-b369-0d40a0b04454" (UID: "63d2248f-f7e9-4ed0-b369-0d40a0b04454"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:34:48.696639 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.696629 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63d2248f-f7e9-4ed0-b369-0d40a0b04454-kube-api-access-5zcmg" (OuterVolumeSpecName: "kube-api-access-5zcmg") pod "63d2248f-f7e9-4ed0-b369-0d40a0b04454" (UID: "63d2248f-f7e9-4ed0-b369-0d40a0b04454"). InnerVolumeSpecName "kube-api-access-5zcmg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:34:48.795787 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.795719 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5zcmg\" (UniqueName: \"kubernetes.io/projected/63d2248f-f7e9-4ed0-b369-0d40a0b04454-kube-api-access-5zcmg\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:34:48.795787 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.795751 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63d2248f-f7e9-4ed0-b369-0d40a0b04454-kserve-provision-location\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:34:48.795787 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.795783 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63d2248f-f7e9-4ed0-b369-0d40a0b04454-proxy-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:34:48.795787 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.795793 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-8d5709-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/63d2248f-f7e9-4ed0-b369-0d40a0b04454-isvc-primary-8d5709-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:34:48.874643 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.874612 2569 generic.go:358] "Generic (PLEG): container finished" podID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerID="e06d439183a93c884670ec71c5ea5e5b587dbafb5f07d2453edbcbaf6fe89a88" exitCode=0 Apr 24 21:34:48.874786 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.874686 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" 
event={"ID":"63d2248f-f7e9-4ed0-b369-0d40a0b04454","Type":"ContainerDied","Data":"e06d439183a93c884670ec71c5ea5e5b587dbafb5f07d2453edbcbaf6fe89a88"} Apr 24 21:34:48.874786 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.874714 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" event={"ID":"63d2248f-f7e9-4ed0-b369-0d40a0b04454","Type":"ContainerDied","Data":"0e6943472526b3a52cf1b569ca73b582644b6404d12b0ccd87b60265a7a56c23"} Apr 24 21:34:48.874786 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.874714 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd" Apr 24 21:34:48.874786 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.874727 2569 scope.go:117] "RemoveContainer" containerID="4b2feb7cfae72b910dbe8cab1f60084dfa7cbc876ad8dcbae8d4aed0ed59fa0a" Apr 24 21:34:48.883528 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.883509 2569 scope.go:117] "RemoveContainer" containerID="e06d439183a93c884670ec71c5ea5e5b587dbafb5f07d2453edbcbaf6fe89a88" Apr 24 21:34:48.890313 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.890295 2569 scope.go:117] "RemoveContainer" containerID="8880e20f4f0f0c85518dc9dee678571645cee637b63e3ff5c638dc49ab090971" Apr 24 21:34:48.895739 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.895719 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"] Apr 24 21:34:48.897609 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.897589 2569 scope.go:117] "RemoveContainer" containerID="4b2feb7cfae72b910dbe8cab1f60084dfa7cbc876ad8dcbae8d4aed0ed59fa0a" Apr 24 21:34:48.898122 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:34:48.898053 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4b2feb7cfae72b910dbe8cab1f60084dfa7cbc876ad8dcbae8d4aed0ed59fa0a\": container with ID starting with 4b2feb7cfae72b910dbe8cab1f60084dfa7cbc876ad8dcbae8d4aed0ed59fa0a not found: ID does not exist" containerID="4b2feb7cfae72b910dbe8cab1f60084dfa7cbc876ad8dcbae8d4aed0ed59fa0a" Apr 24 21:34:48.898364 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.898155 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b2feb7cfae72b910dbe8cab1f60084dfa7cbc876ad8dcbae8d4aed0ed59fa0a"} err="failed to get container status \"4b2feb7cfae72b910dbe8cab1f60084dfa7cbc876ad8dcbae8d4aed0ed59fa0a\": rpc error: code = NotFound desc = could not find container \"4b2feb7cfae72b910dbe8cab1f60084dfa7cbc876ad8dcbae8d4aed0ed59fa0a\": container with ID starting with 4b2feb7cfae72b910dbe8cab1f60084dfa7cbc876ad8dcbae8d4aed0ed59fa0a not found: ID does not exist" Apr 24 21:34:48.898364 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.898207 2569 scope.go:117] "RemoveContainer" containerID="e06d439183a93c884670ec71c5ea5e5b587dbafb5f07d2453edbcbaf6fe89a88" Apr 24 21:34:48.898517 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:34:48.898490 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e06d439183a93c884670ec71c5ea5e5b587dbafb5f07d2453edbcbaf6fe89a88\": container with ID starting with e06d439183a93c884670ec71c5ea5e5b587dbafb5f07d2453edbcbaf6fe89a88 not found: ID does not exist" containerID="e06d439183a93c884670ec71c5ea5e5b587dbafb5f07d2453edbcbaf6fe89a88" Apr 24 21:34:48.898610 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.898525 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e06d439183a93c884670ec71c5ea5e5b587dbafb5f07d2453edbcbaf6fe89a88"} err="failed to get container status \"e06d439183a93c884670ec71c5ea5e5b587dbafb5f07d2453edbcbaf6fe89a88\": rpc error: code = NotFound desc = could not find container 
\"e06d439183a93c884670ec71c5ea5e5b587dbafb5f07d2453edbcbaf6fe89a88\": container with ID starting with e06d439183a93c884670ec71c5ea5e5b587dbafb5f07d2453edbcbaf6fe89a88 not found: ID does not exist" Apr 24 21:34:48.898610 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.898543 2569 scope.go:117] "RemoveContainer" containerID="8880e20f4f0f0c85518dc9dee678571645cee637b63e3ff5c638dc49ab090971" Apr 24 21:34:48.898879 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:34:48.898862 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8880e20f4f0f0c85518dc9dee678571645cee637b63e3ff5c638dc49ab090971\": container with ID starting with 8880e20f4f0f0c85518dc9dee678571645cee637b63e3ff5c638dc49ab090971 not found: ID does not exist" containerID="8880e20f4f0f0c85518dc9dee678571645cee637b63e3ff5c638dc49ab090971" Apr 24 21:34:48.898937 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.898882 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8880e20f4f0f0c85518dc9dee678571645cee637b63e3ff5c638dc49ab090971"} err="failed to get container status \"8880e20f4f0f0c85518dc9dee678571645cee637b63e3ff5c638dc49ab090971\": rpc error: code = NotFound desc = could not find container \"8880e20f4f0f0c85518dc9dee678571645cee637b63e3ff5c638dc49ab090971\": container with ID starting with 8880e20f4f0f0c85518dc9dee678571645cee637b63e3ff5c638dc49ab090971 not found: ID does not exist" Apr 24 21:34:48.900804 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:48.900784 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-8d5709-predictor-59cbfc9977-bkkhd"] Apr 24 21:34:50.436117 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:50.436084 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" path="/var/lib/kubelet/pods/63d2248f-f7e9-4ed0-b369-0d40a0b04454/volumes" Apr 24 21:34:50.882700 
ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:50.882673 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf_97749758-c530-471e-a18d-cb99b3d0801e/storage-initializer/0.log" Apr 24 21:34:50.882849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:50.882712 2569 generic.go:358] "Generic (PLEG): container finished" podID="97749758-c530-471e-a18d-cb99b3d0801e" containerID="fc2e0fa24731984ec62ad79eb01410d0df9076a9391b17d2516e17b40132a7e1" exitCode=1 Apr 24 21:34:50.882849 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:50.882796 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" event={"ID":"97749758-c530-471e-a18d-cb99b3d0801e","Type":"ContainerDied","Data":"fc2e0fa24731984ec62ad79eb01410d0df9076a9391b17d2516e17b40132a7e1"} Apr 24 21:34:51.887615 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:51.887590 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf_97749758-c530-471e-a18d-cb99b3d0801e/storage-initializer/0.log" Apr 24 21:34:51.887988 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:51.887709 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" event={"ID":"97749758-c530-471e-a18d-cb99b3d0801e","Type":"ContainerStarted","Data":"f1517b53da10c1bd0bc0017c71cd307e0d599dfbba0a0ae606b249c28d5e66d4"} Apr 24 21:34:54.665084 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.665040 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf"] Apr 24 21:34:54.665495 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.665351 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" 
podUID="97749758-c530-471e-a18d-cb99b3d0801e" containerName="storage-initializer" containerID="cri-o://f1517b53da10c1bd0bc0017c71cd307e0d599dfbba0a0ae606b249c28d5e66d4" gracePeriod=30 Apr 24 21:34:54.805228 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.805190 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"] Apr 24 21:34:54.805717 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.805696 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db2529bd-999e-40c2-9d81-0ea7ce55fa8d" containerName="storage-initializer" Apr 24 21:34:54.805717 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.805716 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="db2529bd-999e-40c2-9d81-0ea7ce55fa8d" containerName="storage-initializer" Apr 24 21:34:54.805902 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.805742 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="storage-initializer" Apr 24 21:34:54.805902 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.805752 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="storage-initializer" Apr 24 21:34:54.805902 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.805787 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="kube-rbac-proxy" Apr 24 21:34:54.805902 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.805796 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="kube-rbac-proxy" Apr 24 21:34:54.805902 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.805817 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="kserve-container" Apr 24 
21:34:54.805902 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.805827 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="kserve-container" Apr 24 21:34:54.805902 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.805893 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="db2529bd-999e-40c2-9d81-0ea7ce55fa8d" containerName="storage-initializer" Apr 24 21:34:54.806127 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.805908 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="kube-rbac-proxy" Apr 24 21:34:54.806127 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.805920 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="63d2248f-f7e9-4ed0-b369-0d40a0b04454" containerName="kserve-container" Apr 24 21:34:54.806127 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.806013 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db2529bd-999e-40c2-9d81-0ea7ce55fa8d" containerName="storage-initializer" Apr 24 21:34:54.806127 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.806022 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="db2529bd-999e-40c2-9d81-0ea7ce55fa8d" containerName="storage-initializer" Apr 24 21:34:54.806127 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.806093 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="db2529bd-999e-40c2-9d81-0ea7ce55fa8d" containerName="storage-initializer" Apr 24 21:34:54.810732 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.810713 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" Apr 24 21:34:54.813306 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.813280 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-d6223-predictor-serving-cert\"" Apr 24 21:34:54.813306 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.813303 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4gsmf\"" Apr 24 21:34:54.813501 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.813319 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-d6223-kube-rbac-proxy-sar-config\"" Apr 24 21:34:54.818710 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.818686 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"] Apr 24 21:34:54.847347 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.847316 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-d6223-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-raw-sklearn-d6223-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-d6223-predictor-7ccc8855db-kvsx4\" (UID: \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\") " pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" Apr 24 21:34:54.847473 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.847378 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-kserve-provision-location\") pod \"raw-sklearn-d6223-predictor-7ccc8855db-kvsx4\" (UID: \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\") " pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" 
Apr 24 21:34:54.847473 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.847423 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp9rx\" (UniqueName: \"kubernetes.io/projected/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-kube-api-access-gp9rx\") pod \"raw-sklearn-d6223-predictor-7ccc8855db-kvsx4\" (UID: \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\") " pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"
Apr 24 21:34:54.847552 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.847492 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-proxy-tls\") pod \"raw-sklearn-d6223-predictor-7ccc8855db-kvsx4\" (UID: \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\") " pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"
Apr 24 21:34:54.948427 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.948347 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-d6223-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-raw-sklearn-d6223-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-d6223-predictor-7ccc8855db-kvsx4\" (UID: \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\") " pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"
Apr 24 21:34:54.948427 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.948413 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-kserve-provision-location\") pod \"raw-sklearn-d6223-predictor-7ccc8855db-kvsx4\" (UID: \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\") " pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"
Apr 24 21:34:54.948638 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.948433 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gp9rx\" (UniqueName: \"kubernetes.io/projected/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-kube-api-access-gp9rx\") pod \"raw-sklearn-d6223-predictor-7ccc8855db-kvsx4\" (UID: \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\") " pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"
Apr 24 21:34:54.948638 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.948462 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-proxy-tls\") pod \"raw-sklearn-d6223-predictor-7ccc8855db-kvsx4\" (UID: \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\") " pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"
Apr 24 21:34:54.948900 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.948869 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-kserve-provision-location\") pod \"raw-sklearn-d6223-predictor-7ccc8855db-kvsx4\" (UID: \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\") " pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"
Apr 24 21:34:54.949027 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.949009 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-d6223-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-raw-sklearn-d6223-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-d6223-predictor-7ccc8855db-kvsx4\" (UID: \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\") " pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"
Apr 24 21:34:54.950878 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.950862 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-proxy-tls\") pod \"raw-sklearn-d6223-predictor-7ccc8855db-kvsx4\" (UID: \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\") " pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"
Apr 24 21:34:54.956932 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:54.956908 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp9rx\" (UniqueName: \"kubernetes.io/projected/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-kube-api-access-gp9rx\") pod \"raw-sklearn-d6223-predictor-7ccc8855db-kvsx4\" (UID: \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\") " pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"
Apr 24 21:34:55.122413 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.122369 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"
Apr 24 21:34:55.248838 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.248809 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"]
Apr 24 21:34:55.251330 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:34:55.251300 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8e70f70_f47d_48b8_88d6_5d064bbde0d1.slice/crio-a5a58af1057f8ba58879e6cb186665b0f07e0cb6481b2f4af7eeb7ba30004a1f WatchSource:0}: Error finding container a5a58af1057f8ba58879e6cb186665b0f07e0cb6481b2f4af7eeb7ba30004a1f: Status 404 returned error can't find the container with id a5a58af1057f8ba58879e6cb186665b0f07e0cb6481b2f4af7eeb7ba30004a1f
Apr 24 21:34:55.298034 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.298014 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf_97749758-c530-471e-a18d-cb99b3d0801e/storage-initializer/1.log"
Apr 24 21:34:55.298432 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.298409 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf_97749758-c530-471e-a18d-cb99b3d0801e/storage-initializer/0.log"
Apr 24 21:34:55.298507 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.298480 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf"
Apr 24 21:34:55.352008 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.351989 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97749758-c530-471e-a18d-cb99b3d0801e-proxy-tls\") pod \"97749758-c530-471e-a18d-cb99b3d0801e\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") "
Apr 24 21:34:55.352103 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.352018 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/97749758-c530-471e-a18d-cb99b3d0801e-cabundle-cert\") pod \"97749758-c530-471e-a18d-cb99b3d0801e\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") "
Apr 24 21:34:55.352103 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.352060 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kmk6\" (UniqueName: \"kubernetes.io/projected/97749758-c530-471e-a18d-cb99b3d0801e-kube-api-access-9kmk6\") pod \"97749758-c530-471e-a18d-cb99b3d0801e\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") "
Apr 24 21:34:55.352103 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.352090 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97749758-c530-471e-a18d-cb99b3d0801e-kserve-provision-location\") pod \"97749758-c530-471e-a18d-cb99b3d0801e\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") "
Apr 24 21:34:55.352267 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.352129 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-e681b7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97749758-c530-471e-a18d-cb99b3d0801e-isvc-init-fail-e681b7-kube-rbac-proxy-sar-config\") pod \"97749758-c530-471e-a18d-cb99b3d0801e\" (UID: \"97749758-c530-471e-a18d-cb99b3d0801e\") "
Apr 24 21:34:55.352472 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.352400 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97749758-c530-471e-a18d-cb99b3d0801e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "97749758-c530-471e-a18d-cb99b3d0801e" (UID: "97749758-c530-471e-a18d-cb99b3d0801e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:34:55.352472 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.352448 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97749758-c530-471e-a18d-cb99b3d0801e-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "97749758-c530-471e-a18d-cb99b3d0801e" (UID: "97749758-c530-471e-a18d-cb99b3d0801e"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:34:55.352629 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.352537 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97749758-c530-471e-a18d-cb99b3d0801e-isvc-init-fail-e681b7-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-e681b7-kube-rbac-proxy-sar-config") pod "97749758-c530-471e-a18d-cb99b3d0801e" (UID: "97749758-c530-471e-a18d-cb99b3d0801e"). InnerVolumeSpecName "isvc-init-fail-e681b7-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:34:55.353921 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.353900 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97749758-c530-471e-a18d-cb99b3d0801e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "97749758-c530-471e-a18d-cb99b3d0801e" (UID: "97749758-c530-471e-a18d-cb99b3d0801e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:34:55.354041 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.354022 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97749758-c530-471e-a18d-cb99b3d0801e-kube-api-access-9kmk6" (OuterVolumeSpecName: "kube-api-access-9kmk6") pod "97749758-c530-471e-a18d-cb99b3d0801e" (UID: "97749758-c530-471e-a18d-cb99b3d0801e"). InnerVolumeSpecName "kube-api-access-9kmk6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:34:55.453300 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.453272 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-e681b7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97749758-c530-471e-a18d-cb99b3d0801e-isvc-init-fail-e681b7-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:34:55.453300 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.453295 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97749758-c530-471e-a18d-cb99b3d0801e-proxy-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:34:55.453300 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.453304 2569 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/97749758-c530-471e-a18d-cb99b3d0801e-cabundle-cert\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:34:55.453488 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.453313 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9kmk6\" (UniqueName: \"kubernetes.io/projected/97749758-c530-471e-a18d-cb99b3d0801e-kube-api-access-9kmk6\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:34:55.453488 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.453323 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97749758-c530-471e-a18d-cb99b3d0801e-kserve-provision-location\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:34:55.900784 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.900739 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" event={"ID":"a8e70f70-f47d-48b8-88d6-5d064bbde0d1","Type":"ContainerStarted","Data":"40d2bbc5036fecea646aa44aff88e1e754600c23fa6b1aa1b75314e9f780026b"}
Apr 24 21:34:55.900784 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.900789 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" event={"ID":"a8e70f70-f47d-48b8-88d6-5d064bbde0d1","Type":"ContainerStarted","Data":"a5a58af1057f8ba58879e6cb186665b0f07e0cb6481b2f4af7eeb7ba30004a1f"}
Apr 24 21:34:55.901854 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.901835 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf_97749758-c530-471e-a18d-cb99b3d0801e/storage-initializer/1.log"
Apr 24 21:34:55.902189 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.902175 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf_97749758-c530-471e-a18d-cb99b3d0801e/storage-initializer/0.log"
Apr 24 21:34:55.902242 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.902207 2569 generic.go:358] "Generic (PLEG): container finished" podID="97749758-c530-471e-a18d-cb99b3d0801e" containerID="f1517b53da10c1bd0bc0017c71cd307e0d599dfbba0a0ae606b249c28d5e66d4" exitCode=1
Apr 24 21:34:55.902297 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.902270 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" event={"ID":"97749758-c530-471e-a18d-cb99b3d0801e","Type":"ContainerDied","Data":"f1517b53da10c1bd0bc0017c71cd307e0d599dfbba0a0ae606b249c28d5e66d4"}
Apr 24 21:34:55.902297 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.902280 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf"
Apr 24 21:34:55.902297 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.902288 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf" event={"ID":"97749758-c530-471e-a18d-cb99b3d0801e","Type":"ContainerDied","Data":"8adffc0aff32f5d4b5203c4c72aefaccb603212c66f3d57c61167d54930aeb3c"}
Apr 24 21:34:55.902399 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.902302 2569 scope.go:117] "RemoveContainer" containerID="f1517b53da10c1bd0bc0017c71cd307e0d599dfbba0a0ae606b249c28d5e66d4"
Apr 24 21:34:55.910315 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.910297 2569 scope.go:117] "RemoveContainer" containerID="fc2e0fa24731984ec62ad79eb01410d0df9076a9391b17d2516e17b40132a7e1"
Apr 24 21:34:55.916881 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.916867 2569 scope.go:117] "RemoveContainer" containerID="f1517b53da10c1bd0bc0017c71cd307e0d599dfbba0a0ae606b249c28d5e66d4"
Apr 24 21:34:55.917077 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:34:55.917062 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1517b53da10c1bd0bc0017c71cd307e0d599dfbba0a0ae606b249c28d5e66d4\": container with ID starting with f1517b53da10c1bd0bc0017c71cd307e0d599dfbba0a0ae606b249c28d5e66d4 not found: ID does not exist" containerID="f1517b53da10c1bd0bc0017c71cd307e0d599dfbba0a0ae606b249c28d5e66d4"
Apr 24 21:34:55.917127 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.917084 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1517b53da10c1bd0bc0017c71cd307e0d599dfbba0a0ae606b249c28d5e66d4"} err="failed to get container status \"f1517b53da10c1bd0bc0017c71cd307e0d599dfbba0a0ae606b249c28d5e66d4\": rpc error: code = NotFound desc = could not find container \"f1517b53da10c1bd0bc0017c71cd307e0d599dfbba0a0ae606b249c28d5e66d4\": container with ID starting with f1517b53da10c1bd0bc0017c71cd307e0d599dfbba0a0ae606b249c28d5e66d4 not found: ID does not exist"
Apr 24 21:34:55.917127 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.917101 2569 scope.go:117] "RemoveContainer" containerID="fc2e0fa24731984ec62ad79eb01410d0df9076a9391b17d2516e17b40132a7e1"
Apr 24 21:34:55.917337 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:34:55.917317 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc2e0fa24731984ec62ad79eb01410d0df9076a9391b17d2516e17b40132a7e1\": container with ID starting with fc2e0fa24731984ec62ad79eb01410d0df9076a9391b17d2516e17b40132a7e1 not found: ID does not exist" containerID="fc2e0fa24731984ec62ad79eb01410d0df9076a9391b17d2516e17b40132a7e1"
Apr 24 21:34:55.917374 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.917346 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc2e0fa24731984ec62ad79eb01410d0df9076a9391b17d2516e17b40132a7e1"} err="failed to get container status \"fc2e0fa24731984ec62ad79eb01410d0df9076a9391b17d2516e17b40132a7e1\": rpc error: code = NotFound desc = could not find container \"fc2e0fa24731984ec62ad79eb01410d0df9076a9391b17d2516e17b40132a7e1\": container with ID starting with fc2e0fa24731984ec62ad79eb01410d0df9076a9391b17d2516e17b40132a7e1 not found: ID does not exist"
Apr 24 21:34:55.944941 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.944718 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf"]
Apr 24 21:34:55.946307 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:55.946283 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e681b7-predictor-57cb65b9d8-nljmf"]
Apr 24 21:34:56.436652 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:56.436619 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97749758-c530-471e-a18d-cb99b3d0801e" path="/var/lib/kubelet/pods/97749758-c530-471e-a18d-cb99b3d0801e/volumes"
Apr 24 21:34:59.921883 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:59.921846 2569 generic.go:358] "Generic (PLEG): container finished" podID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerID="40d2bbc5036fecea646aa44aff88e1e754600c23fa6b1aa1b75314e9f780026b" exitCode=0
Apr 24 21:34:59.922288 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:34:59.921918 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" event={"ID":"a8e70f70-f47d-48b8-88d6-5d064bbde0d1","Type":"ContainerDied","Data":"40d2bbc5036fecea646aa44aff88e1e754600c23fa6b1aa1b75314e9f780026b"}
Apr 24 21:35:00.926161 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:35:00.926128 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" event={"ID":"a8e70f70-f47d-48b8-88d6-5d064bbde0d1","Type":"ContainerStarted","Data":"6fd04eeee01da8f03950d046d7ba02161a12cf48207ff589e9b5954422dbbc36"}
Apr 24 21:35:00.926621 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:35:00.926172 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" event={"ID":"a8e70f70-f47d-48b8-88d6-5d064bbde0d1","Type":"ContainerStarted","Data":"6d259144f7562241c4164223dbc854997311631d78cacb936f76556f25251ad5"}
Apr 24 21:35:00.926621 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:35:00.926455 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"
Apr 24 21:35:00.926621 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:35:00.926467 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"
Apr 24 21:35:00.927828 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:35:00.927810 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 21:35:00.945291 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:35:00.945249 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" podStartSLOduration=6.945235195 podStartE2EDuration="6.945235195s" podCreationTimestamp="2026-04-24 21:34:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:35:00.943324735 +0000 UTC m=+1089.077723357" watchObservedRunningTime="2026-04-24 21:35:00.945235195 +0000 UTC m=+1089.079633816"
Apr 24 21:35:01.929348 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:35:01.929307 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 21:35:06.934945 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:35:06.934916 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"
Apr 24 21:35:06.935496 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:35:06.935471 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 21:35:16.936408 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:35:16.936366 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 21:35:26.936428 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:35:26.936379 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 21:35:36.935436 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:35:36.935394 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 21:35:46.935416 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:35:46.935377 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 21:35:56.935994 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:35:56.935948 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 21:36:06.936458 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:06.936427 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"
Apr 24 21:36:14.905832 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:14.905792 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"]
Apr 24 21:36:14.906454 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:14.906214 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kserve-container" containerID="cri-o://6d259144f7562241c4164223dbc854997311631d78cacb936f76556f25251ad5" gracePeriod=30
Apr 24 21:36:14.906454 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:14.906252 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kube-rbac-proxy" containerID="cri-o://6fd04eeee01da8f03950d046d7ba02161a12cf48207ff589e9b5954422dbbc36" gracePeriod=30
Apr 24 21:36:14.993287 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:14.993260 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"]
Apr 24 21:36:14.993636 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:14.993618 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97749758-c530-471e-a18d-cb99b3d0801e" containerName="storage-initializer"
Apr 24 21:36:14.993636 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:14.993633 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="97749758-c530-471e-a18d-cb99b3d0801e" containerName="storage-initializer"
Apr 24 21:36:14.993735 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:14.993693 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="97749758-c530-471e-a18d-cb99b3d0801e" containerName="storage-initializer"
Apr 24 21:36:14.993735 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:14.993703 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="97749758-c530-471e-a18d-cb99b3d0801e" containerName="storage-initializer"
Apr 24 21:36:14.993852 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:14.993750 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97749758-c530-471e-a18d-cb99b3d0801e" containerName="storage-initializer"
Apr 24 21:36:14.993852 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:14.993776 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="97749758-c530-471e-a18d-cb99b3d0801e" containerName="storage-initializer"
Apr 24 21:36:14.998152 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:14.998122 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:36:15.000452 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.000430 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-8f851-predictor-serving-cert\""
Apr 24 21:36:15.000541 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.000436 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-8f851-kube-rbac-proxy-sar-config\""
Apr 24 21:36:15.007246 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.007224 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"]
Apr 24 21:36:15.071724 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.071702 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrjh5\" (UniqueName: \"kubernetes.io/projected/8547082c-6adf-40d2-8dce-e15eb63349b2-kube-api-access-mrjh5\") pod \"raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:36:15.071846 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.071733 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8547082c-6adf-40d2-8dce-e15eb63349b2-kserve-provision-location\") pod \"raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:36:15.071846 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.071811 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-runtime-8f851-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8547082c-6adf-40d2-8dce-e15eb63349b2-raw-sklearn-runtime-8f851-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:36:15.071924 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.071900 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8547082c-6adf-40d2-8dce-e15eb63349b2-proxy-tls\") pod \"raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:36:15.144213 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.144178 2569 generic.go:358] "Generic (PLEG): container finished" podID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerID="6fd04eeee01da8f03950d046d7ba02161a12cf48207ff589e9b5954422dbbc36" exitCode=2
Apr 24 21:36:15.144333 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.144250 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" event={"ID":"a8e70f70-f47d-48b8-88d6-5d064bbde0d1","Type":"ContainerDied","Data":"6fd04eeee01da8f03950d046d7ba02161a12cf48207ff589e9b5954422dbbc36"}
Apr 24 21:36:15.172968 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.172915 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-runtime-8f851-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8547082c-6adf-40d2-8dce-e15eb63349b2-raw-sklearn-runtime-8f851-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:36:15.172968 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.172961 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8547082c-6adf-40d2-8dce-e15eb63349b2-proxy-tls\") pod \"raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:36:15.173100 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.172992 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrjh5\" (UniqueName: \"kubernetes.io/projected/8547082c-6adf-40d2-8dce-e15eb63349b2-kube-api-access-mrjh5\") pod \"raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:36:15.173100 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.173012 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8547082c-6adf-40d2-8dce-e15eb63349b2-kserve-provision-location\") pod \"raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:36:15.173100 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:36:15.173093 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-serving-cert: secret "raw-sklearn-runtime-8f851-predictor-serving-cert" not found
Apr 24 21:36:15.173252 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:36:15.173160 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8547082c-6adf-40d2-8dce-e15eb63349b2-proxy-tls podName:8547082c-6adf-40d2-8dce-e15eb63349b2 nodeName:}" failed. No retries permitted until 2026-04-24 21:36:15.673138489 +0000 UTC m=+1163.807537094 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8547082c-6adf-40d2-8dce-e15eb63349b2-proxy-tls") pod "raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" (UID: "8547082c-6adf-40d2-8dce-e15eb63349b2") : secret "raw-sklearn-runtime-8f851-predictor-serving-cert" not found
Apr 24 21:36:15.173346 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.173331 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8547082c-6adf-40d2-8dce-e15eb63349b2-kserve-provision-location\") pod \"raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:36:15.173654 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.173636 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-runtime-8f851-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8547082c-6adf-40d2-8dce-e15eb63349b2-raw-sklearn-runtime-8f851-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:36:15.183367 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.183344 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrjh5\" (UniqueName: \"kubernetes.io/projected/8547082c-6adf-40d2-8dce-e15eb63349b2-kube-api-access-mrjh5\") pod \"raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:36:15.675601 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.675569 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8547082c-6adf-40d2-8dce-e15eb63349b2-proxy-tls\") pod \"raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:36:15.677989 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.677957 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8547082c-6adf-40d2-8dce-e15eb63349b2-proxy-tls\") pod \"raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:36:15.908946 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:15.908913 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:36:16.024435 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:16.024413 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"]
Apr 24 21:36:16.026615 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:36:16.026578 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8547082c_6adf_40d2_8dce_e15eb63349b2.slice/crio-007acdea28f7d6364b1ffc4e122360def7d6fba375ad3138a865f7434b07604b WatchSource:0}: Error finding container 007acdea28f7d6364b1ffc4e122360def7d6fba375ad3138a865f7434b07604b: Status 404 returned error can't find the container with id 007acdea28f7d6364b1ffc4e122360def7d6fba375ad3138a865f7434b07604b
Apr 24 21:36:16.148780 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:16.148717 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" event={"ID":"8547082c-6adf-40d2-8dce-e15eb63349b2","Type":"ContainerStarted","Data":"4fc7f706e72ed8b03fe418da15a64a615c97e7fb4d6d97227d40256d87cf2754"}
Apr 24 21:36:16.148780 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:16.148780 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" event={"ID":"8547082c-6adf-40d2-8dce-e15eb63349b2","Type":"ContainerStarted","Data":"007acdea28f7d6364b1ffc4e122360def7d6fba375ad3138a865f7434b07604b"}
Apr 24 21:36:16.930225 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:16.930183 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643:
connect: connection refused" Apr 24 21:36:16.935844 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:16.935818 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 21:36:19.043417 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.043390 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" Apr 24 21:36:19.102888 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.102827 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-proxy-tls\") pod \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\" (UID: \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\") " Apr 24 21:36:19.102888 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.102873 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-d6223-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-raw-sklearn-d6223-kube-rbac-proxy-sar-config\") pod \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\" (UID: \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\") " Apr 24 21:36:19.103028 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.102940 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-kserve-provision-location\") pod \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\" (UID: \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\") " Apr 24 21:36:19.103028 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.102984 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-gp9rx\" (UniqueName: \"kubernetes.io/projected/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-kube-api-access-gp9rx\") pod \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\" (UID: \"a8e70f70-f47d-48b8-88d6-5d064bbde0d1\") " Apr 24 21:36:19.103266 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.103242 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-raw-sklearn-d6223-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-d6223-kube-rbac-proxy-sar-config") pod "a8e70f70-f47d-48b8-88d6-5d064bbde0d1" (UID: "a8e70f70-f47d-48b8-88d6-5d064bbde0d1"). InnerVolumeSpecName "raw-sklearn-d6223-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:36:19.103330 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.103284 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a8e70f70-f47d-48b8-88d6-5d064bbde0d1" (UID: "a8e70f70-f47d-48b8-88d6-5d064bbde0d1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:36:19.105075 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.105051 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-kube-api-access-gp9rx" (OuterVolumeSpecName: "kube-api-access-gp9rx") pod "a8e70f70-f47d-48b8-88d6-5d064bbde0d1" (UID: "a8e70f70-f47d-48b8-88d6-5d064bbde0d1"). InnerVolumeSpecName "kube-api-access-gp9rx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:36:19.105075 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.105059 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a8e70f70-f47d-48b8-88d6-5d064bbde0d1" (UID: "a8e70f70-f47d-48b8-88d6-5d064bbde0d1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:36:19.160551 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.160525 2569 generic.go:358] "Generic (PLEG): container finished" podID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerID="6d259144f7562241c4164223dbc854997311631d78cacb936f76556f25251ad5" exitCode=0 Apr 24 21:36:19.160650 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.160587 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" event={"ID":"a8e70f70-f47d-48b8-88d6-5d064bbde0d1","Type":"ContainerDied","Data":"6d259144f7562241c4164223dbc854997311631d78cacb936f76556f25251ad5"} Apr 24 21:36:19.160650 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.160604 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" Apr 24 21:36:19.160650 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.160614 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4" event={"ID":"a8e70f70-f47d-48b8-88d6-5d064bbde0d1","Type":"ContainerDied","Data":"a5a58af1057f8ba58879e6cb186665b0f07e0cb6481b2f4af7eeb7ba30004a1f"} Apr 24 21:36:19.160650 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.160629 2569 scope.go:117] "RemoveContainer" containerID="6fd04eeee01da8f03950d046d7ba02161a12cf48207ff589e9b5954422dbbc36" Apr 24 21:36:19.168800 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.168781 2569 scope.go:117] "RemoveContainer" containerID="6d259144f7562241c4164223dbc854997311631d78cacb936f76556f25251ad5" Apr 24 21:36:19.177370 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.177355 2569 scope.go:117] "RemoveContainer" containerID="40d2bbc5036fecea646aa44aff88e1e754600c23fa6b1aa1b75314e9f780026b" Apr 24 21:36:19.184481 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.184456 2569 scope.go:117] "RemoveContainer" containerID="6fd04eeee01da8f03950d046d7ba02161a12cf48207ff589e9b5954422dbbc36" Apr 24 21:36:19.184866 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:36:19.184733 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd04eeee01da8f03950d046d7ba02161a12cf48207ff589e9b5954422dbbc36\": container with ID starting with 6fd04eeee01da8f03950d046d7ba02161a12cf48207ff589e9b5954422dbbc36 not found: ID does not exist" containerID="6fd04eeee01da8f03950d046d7ba02161a12cf48207ff589e9b5954422dbbc36" Apr 24 21:36:19.184866 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.184781 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd04eeee01da8f03950d046d7ba02161a12cf48207ff589e9b5954422dbbc36"} err="failed to get 
container status \"6fd04eeee01da8f03950d046d7ba02161a12cf48207ff589e9b5954422dbbc36\": rpc error: code = NotFound desc = could not find container \"6fd04eeee01da8f03950d046d7ba02161a12cf48207ff589e9b5954422dbbc36\": container with ID starting with 6fd04eeee01da8f03950d046d7ba02161a12cf48207ff589e9b5954422dbbc36 not found: ID does not exist" Apr 24 21:36:19.184866 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.184802 2569 scope.go:117] "RemoveContainer" containerID="6d259144f7562241c4164223dbc854997311631d78cacb936f76556f25251ad5" Apr 24 21:36:19.185149 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:36:19.185030 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d259144f7562241c4164223dbc854997311631d78cacb936f76556f25251ad5\": container with ID starting with 6d259144f7562241c4164223dbc854997311631d78cacb936f76556f25251ad5 not found: ID does not exist" containerID="6d259144f7562241c4164223dbc854997311631d78cacb936f76556f25251ad5" Apr 24 21:36:19.185149 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.185054 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d259144f7562241c4164223dbc854997311631d78cacb936f76556f25251ad5"} err="failed to get container status \"6d259144f7562241c4164223dbc854997311631d78cacb936f76556f25251ad5\": rpc error: code = NotFound desc = could not find container \"6d259144f7562241c4164223dbc854997311631d78cacb936f76556f25251ad5\": container with ID starting with 6d259144f7562241c4164223dbc854997311631d78cacb936f76556f25251ad5 not found: ID does not exist" Apr 24 21:36:19.185149 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.185074 2569 scope.go:117] "RemoveContainer" containerID="40d2bbc5036fecea646aa44aff88e1e754600c23fa6b1aa1b75314e9f780026b" Apr 24 21:36:19.185342 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:36:19.185322 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"40d2bbc5036fecea646aa44aff88e1e754600c23fa6b1aa1b75314e9f780026b\": container with ID starting with 40d2bbc5036fecea646aa44aff88e1e754600c23fa6b1aa1b75314e9f780026b not found: ID does not exist" containerID="40d2bbc5036fecea646aa44aff88e1e754600c23fa6b1aa1b75314e9f780026b" Apr 24 21:36:19.185380 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.185347 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40d2bbc5036fecea646aa44aff88e1e754600c23fa6b1aa1b75314e9f780026b"} err="failed to get container status \"40d2bbc5036fecea646aa44aff88e1e754600c23fa6b1aa1b75314e9f780026b\": rpc error: code = NotFound desc = could not find container \"40d2bbc5036fecea646aa44aff88e1e754600c23fa6b1aa1b75314e9f780026b\": container with ID starting with 40d2bbc5036fecea646aa44aff88e1e754600c23fa6b1aa1b75314e9f780026b not found: ID does not exist" Apr 24 21:36:19.186500 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.186483 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"] Apr 24 21:36:19.189439 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.189418 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-d6223-predictor-7ccc8855db-kvsx4"] Apr 24 21:36:19.203486 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.203466 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-kserve-provision-location\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:36:19.203486 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.203488 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gp9rx\" (UniqueName: \"kubernetes.io/projected/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-kube-api-access-gp9rx\") on node \"ip-10-0-132-118.ec2.internal\" 
DevicePath \"\"" Apr 24 21:36:19.203608 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.203497 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-proxy-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:36:19.203608 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:19.203507 2569 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-d6223-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8e70f70-f47d-48b8-88d6-5d064bbde0d1-raw-sklearn-d6223-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\"" Apr 24 21:36:20.165556 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:20.165518 2569 generic.go:358] "Generic (PLEG): container finished" podID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerID="4fc7f706e72ed8b03fe418da15a64a615c97e7fb4d6d97227d40256d87cf2754" exitCode=0 Apr 24 21:36:20.165556 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:20.165556 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" event={"ID":"8547082c-6adf-40d2-8dce-e15eb63349b2","Type":"ContainerDied","Data":"4fc7f706e72ed8b03fe418da15a64a615c97e7fb4d6d97227d40256d87cf2754"} Apr 24 21:36:20.436782 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:20.436693 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" path="/var/lib/kubelet/pods/a8e70f70-f47d-48b8-88d6-5d064bbde0d1/volumes" Apr 24 21:36:21.170111 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:21.170078 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" event={"ID":"8547082c-6adf-40d2-8dce-e15eb63349b2","Type":"ContainerStarted","Data":"97030b767ebaf46554eb4c5058330a844dc07ccf59ee112ad303239eacc51d91"} Apr 24 21:36:21.170111 
ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:21.170117 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" event={"ID":"8547082c-6adf-40d2-8dce-e15eb63349b2","Type":"ContainerStarted","Data":"22a34f0365efdb6084cfc7b9edfe0da3e14fca80cda366fe74d4e203de940c36"} Apr 24 21:36:21.170556 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:21.170397 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" Apr 24 21:36:21.170556 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:21.170505 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" Apr 24 21:36:21.171641 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:21.171616 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:36:21.190442 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:21.190402 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" podStartSLOduration=7.190388679 podStartE2EDuration="7.190388679s" podCreationTimestamp="2026-04-24 21:36:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:36:21.189038535 +0000 UTC m=+1169.323437155" watchObservedRunningTime="2026-04-24 21:36:21.190388679 +0000 UTC m=+1169.324787300" Apr 24 21:36:22.173460 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:22.173417 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:36:27.178005 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:27.177975 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" Apr 24 21:36:27.178514 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:27.178489 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:36:37.178445 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:37.178399 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:36:47.179496 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:47.179454 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:36:57.178609 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:36:57.178571 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection 
refused" Apr 24 21:37:07.178771 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:07.178712 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:37:17.179330 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:17.179288 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:37:27.179488 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:27.179462 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" Apr 24 21:37:35.120926 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.120896 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"] Apr 24 21:37:35.121287 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.121190 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kserve-container" containerID="cri-o://22a34f0365efdb6084cfc7b9edfe0da3e14fca80cda366fe74d4e203de940c36" gracePeriod=30 Apr 24 21:37:35.121287 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.121223 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kube-rbac-proxy" 
containerID="cri-o://97030b767ebaf46554eb4c5058330a844dc07ccf59ee112ad303239eacc51d91" gracePeriod=30 Apr 24 21:37:35.386659 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.386582 2569 generic.go:358] "Generic (PLEG): container finished" podID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerID="97030b767ebaf46554eb4c5058330a844dc07ccf59ee112ad303239eacc51d91" exitCode=2 Apr 24 21:37:35.386659 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.386618 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" event={"ID":"8547082c-6adf-40d2-8dce-e15eb63349b2","Type":"ContainerDied","Data":"97030b767ebaf46554eb4c5058330a844dc07ccf59ee112ad303239eacc51d91"} Apr 24 21:37:35.814601 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.814570 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-z5qrn/must-gather-6h8g9"] Apr 24 21:37:35.814971 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.814957 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kube-rbac-proxy" Apr 24 21:37:35.815037 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.814972 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kube-rbac-proxy" Apr 24 21:37:35.815037 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.814982 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="storage-initializer" Apr 24 21:37:35.815037 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.814988 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="storage-initializer" Apr 24 21:37:35.815037 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.814998 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kserve-container" Apr 24 21:37:35.815037 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.815004 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kserve-container" Apr 24 21:37:35.815213 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.815060 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kserve-container" Apr 24 21:37:35.815213 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.815069 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8e70f70-f47d-48b8-88d6-5d064bbde0d1" containerName="kube-rbac-proxy" Apr 24 21:37:35.818085 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.818066 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5qrn/must-gather-6h8g9" Apr 24 21:37:35.820893 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.820873 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-z5qrn\"/\"default-dockercfg-lwsnc\"" Apr 24 21:37:35.821992 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.821971 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-z5qrn\"/\"openshift-service-ca.crt\"" Apr 24 21:37:35.822098 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.821999 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-z5qrn\"/\"kube-root-ca.crt\"" Apr 24 21:37:35.829614 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.829596 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-z5qrn/must-gather-6h8g9"] Apr 24 21:37:35.899162 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.899134 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fb09aa5d-1edc-481f-9168-51c7c4a760aa-must-gather-output\") pod \"must-gather-6h8g9\" (UID: \"fb09aa5d-1edc-481f-9168-51c7c4a760aa\") " pod="openshift-must-gather-z5qrn/must-gather-6h8g9" Apr 24 21:37:35.899271 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:35.899184 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ldqg\" (UniqueName: \"kubernetes.io/projected/fb09aa5d-1edc-481f-9168-51c7c4a760aa-kube-api-access-8ldqg\") pod \"must-gather-6h8g9\" (UID: \"fb09aa5d-1edc-481f-9168-51c7c4a760aa\") " pod="openshift-must-gather-z5qrn/must-gather-6h8g9" Apr 24 21:37:36.000531 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:36.000511 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fb09aa5d-1edc-481f-9168-51c7c4a760aa-must-gather-output\") pod \"must-gather-6h8g9\" (UID: \"fb09aa5d-1edc-481f-9168-51c7c4a760aa\") " pod="openshift-must-gather-z5qrn/must-gather-6h8g9" Apr 24 21:37:36.000636 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:36.000553 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ldqg\" (UniqueName: \"kubernetes.io/projected/fb09aa5d-1edc-481f-9168-51c7c4a760aa-kube-api-access-8ldqg\") pod \"must-gather-6h8g9\" (UID: \"fb09aa5d-1edc-481f-9168-51c7c4a760aa\") " pod="openshift-must-gather-z5qrn/must-gather-6h8g9" Apr 24 21:37:36.000853 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:36.000835 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fb09aa5d-1edc-481f-9168-51c7c4a760aa-must-gather-output\") pod \"must-gather-6h8g9\" (UID: \"fb09aa5d-1edc-481f-9168-51c7c4a760aa\") " pod="openshift-must-gather-z5qrn/must-gather-6h8g9" Apr 24 21:37:36.009681 ip-10-0-132-118 kubenswrapper[2569]: I0424 
21:37:36.009664 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ldqg\" (UniqueName: \"kubernetes.io/projected/fb09aa5d-1edc-481f-9168-51c7c4a760aa-kube-api-access-8ldqg\") pod \"must-gather-6h8g9\" (UID: \"fb09aa5d-1edc-481f-9168-51c7c4a760aa\") " pod="openshift-must-gather-z5qrn/must-gather-6h8g9" Apr 24 21:37:36.134222 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:36.134146 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5qrn/must-gather-6h8g9" Apr 24 21:37:36.250692 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:36.250657 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-z5qrn/must-gather-6h8g9"] Apr 24 21:37:36.253070 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:37:36.253042 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb09aa5d_1edc_481f_9168_51c7c4a760aa.slice/crio-acfc8b9a4c004c9e6230c19f8abd9ea3bdf257604c3a0c2d38daaa3302858af7 WatchSource:0}: Error finding container acfc8b9a4c004c9e6230c19f8abd9ea3bdf257604c3a0c2d38daaa3302858af7: Status 404 returned error can't find the container with id acfc8b9a4c004c9e6230c19f8abd9ea3bdf257604c3a0c2d38daaa3302858af7 Apr 24 21:37:36.391202 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:36.391126 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5qrn/must-gather-6h8g9" event={"ID":"fb09aa5d-1edc-481f-9168-51c7c4a760aa","Type":"ContainerStarted","Data":"acfc8b9a4c004c9e6230c19f8abd9ea3bdf257604c3a0c2d38daaa3302858af7"} Apr 24 21:37:37.174273 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:37.174231 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused"
Apr 24 21:37:37.179529 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:37.179487 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 21:37:39.914725 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:39.914696 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:37:40.036189 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.036154 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8547082c-6adf-40d2-8dce-e15eb63349b2-kserve-provision-location\") pod \"8547082c-6adf-40d2-8dce-e15eb63349b2\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") "
Apr 24 21:37:40.036372 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.036205 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrjh5\" (UniqueName: \"kubernetes.io/projected/8547082c-6adf-40d2-8dce-e15eb63349b2-kube-api-access-mrjh5\") pod \"8547082c-6adf-40d2-8dce-e15eb63349b2\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") "
Apr 24 21:37:40.036372 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.036266 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-runtime-8f851-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8547082c-6adf-40d2-8dce-e15eb63349b2-raw-sklearn-runtime-8f851-kube-rbac-proxy-sar-config\") pod \"8547082c-6adf-40d2-8dce-e15eb63349b2\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") "
Apr 24 21:37:40.036372 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.036323 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8547082c-6adf-40d2-8dce-e15eb63349b2-proxy-tls\") pod \"8547082c-6adf-40d2-8dce-e15eb63349b2\" (UID: \"8547082c-6adf-40d2-8dce-e15eb63349b2\") "
Apr 24 21:37:40.036580 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.036474 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8547082c-6adf-40d2-8dce-e15eb63349b2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8547082c-6adf-40d2-8dce-e15eb63349b2" (UID: "8547082c-6adf-40d2-8dce-e15eb63349b2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:37:40.036639 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.036598 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8547082c-6adf-40d2-8dce-e15eb63349b2-kserve-provision-location\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:37:40.036693 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.036656 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8547082c-6adf-40d2-8dce-e15eb63349b2-raw-sklearn-runtime-8f851-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-runtime-8f851-kube-rbac-proxy-sar-config") pod "8547082c-6adf-40d2-8dce-e15eb63349b2" (UID: "8547082c-6adf-40d2-8dce-e15eb63349b2"). InnerVolumeSpecName "raw-sklearn-runtime-8f851-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:37:40.038822 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.038792 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8547082c-6adf-40d2-8dce-e15eb63349b2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8547082c-6adf-40d2-8dce-e15eb63349b2" (UID: "8547082c-6adf-40d2-8dce-e15eb63349b2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:37:40.038822 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.038796 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8547082c-6adf-40d2-8dce-e15eb63349b2-kube-api-access-mrjh5" (OuterVolumeSpecName: "kube-api-access-mrjh5") pod "8547082c-6adf-40d2-8dce-e15eb63349b2" (UID: "8547082c-6adf-40d2-8dce-e15eb63349b2"). InnerVolumeSpecName "kube-api-access-mrjh5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:37:40.137857 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.137817 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mrjh5\" (UniqueName: \"kubernetes.io/projected/8547082c-6adf-40d2-8dce-e15eb63349b2-kube-api-access-mrjh5\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:37:40.137857 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.137852 2569 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-runtime-8f851-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8547082c-6adf-40d2-8dce-e15eb63349b2-raw-sklearn-runtime-8f851-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:37:40.138066 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.137867 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8547082c-6adf-40d2-8dce-e15eb63349b2-proxy-tls\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:37:40.406270 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.406180 2569 generic.go:358] "Generic (PLEG): container finished" podID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerID="22a34f0365efdb6084cfc7b9edfe0da3e14fca80cda366fe74d4e203de940c36" exitCode=0
Apr 24 21:37:40.406270 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.406243 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" event={"ID":"8547082c-6adf-40d2-8dce-e15eb63349b2","Type":"ContainerDied","Data":"22a34f0365efdb6084cfc7b9edfe0da3e14fca80cda366fe74d4e203de940c36"}
Apr 24 21:37:40.406483 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.406279 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff" event={"ID":"8547082c-6adf-40d2-8dce-e15eb63349b2","Type":"ContainerDied","Data":"007acdea28f7d6364b1ffc4e122360def7d6fba375ad3138a865f7434b07604b"}
Apr 24 21:37:40.406483 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.406295 2569 scope.go:117] "RemoveContainer" containerID="97030b767ebaf46554eb4c5058330a844dc07ccf59ee112ad303239eacc51d91"
Apr 24 21:37:40.406483 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.406292 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"
Apr 24 21:37:40.430359 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.430315 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"]
Apr 24 21:37:40.438153 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.438132 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8f851-predictor-6ccb889f98-5bcff"]
Apr 24 21:37:40.840909 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.840746 2569 scope.go:117] "RemoveContainer" containerID="22a34f0365efdb6084cfc7b9edfe0da3e14fca80cda366fe74d4e203de940c36"
Apr 24 21:37:40.848373 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.848355 2569 scope.go:117] "RemoveContainer" containerID="4fc7f706e72ed8b03fe418da15a64a615c97e7fb4d6d97227d40256d87cf2754"
Apr 24 21:37:40.855049 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.855034 2569 scope.go:117] "RemoveContainer" containerID="97030b767ebaf46554eb4c5058330a844dc07ccf59ee112ad303239eacc51d91"
Apr 24 21:37:40.855284 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:37:40.855266 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97030b767ebaf46554eb4c5058330a844dc07ccf59ee112ad303239eacc51d91\": container with ID starting with 97030b767ebaf46554eb4c5058330a844dc07ccf59ee112ad303239eacc51d91 not found: ID does not exist" containerID="97030b767ebaf46554eb4c5058330a844dc07ccf59ee112ad303239eacc51d91"
Apr 24 21:37:40.855330 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.855295 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97030b767ebaf46554eb4c5058330a844dc07ccf59ee112ad303239eacc51d91"} err="failed to get container status \"97030b767ebaf46554eb4c5058330a844dc07ccf59ee112ad303239eacc51d91\": rpc error: code = NotFound desc = could not find container \"97030b767ebaf46554eb4c5058330a844dc07ccf59ee112ad303239eacc51d91\": container with ID starting with 97030b767ebaf46554eb4c5058330a844dc07ccf59ee112ad303239eacc51d91 not found: ID does not exist"
Apr 24 21:37:40.855330 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.855314 2569 scope.go:117] "RemoveContainer" containerID="22a34f0365efdb6084cfc7b9edfe0da3e14fca80cda366fe74d4e203de940c36"
Apr 24 21:37:40.855533 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:37:40.855508 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22a34f0365efdb6084cfc7b9edfe0da3e14fca80cda366fe74d4e203de940c36\": container with ID starting with 22a34f0365efdb6084cfc7b9edfe0da3e14fca80cda366fe74d4e203de940c36 not found: ID does not exist" containerID="22a34f0365efdb6084cfc7b9edfe0da3e14fca80cda366fe74d4e203de940c36"
Apr 24 21:37:40.855576 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.855540 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22a34f0365efdb6084cfc7b9edfe0da3e14fca80cda366fe74d4e203de940c36"} err="failed to get container status \"22a34f0365efdb6084cfc7b9edfe0da3e14fca80cda366fe74d4e203de940c36\": rpc error: code = NotFound desc = could not find container \"22a34f0365efdb6084cfc7b9edfe0da3e14fca80cda366fe74d4e203de940c36\": container with ID starting with 22a34f0365efdb6084cfc7b9edfe0da3e14fca80cda366fe74d4e203de940c36 not found: ID does not exist"
Apr 24 21:37:40.855576 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.855555 2569 scope.go:117] "RemoveContainer" containerID="4fc7f706e72ed8b03fe418da15a64a615c97e7fb4d6d97227d40256d87cf2754"
Apr 24 21:37:40.855782 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:37:40.855750 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fc7f706e72ed8b03fe418da15a64a615c97e7fb4d6d97227d40256d87cf2754\": container with ID starting with 4fc7f706e72ed8b03fe418da15a64a615c97e7fb4d6d97227d40256d87cf2754 not found: ID does not exist" containerID="4fc7f706e72ed8b03fe418da15a64a615c97e7fb4d6d97227d40256d87cf2754"
Apr 24 21:37:40.855831 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:40.855785 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fc7f706e72ed8b03fe418da15a64a615c97e7fb4d6d97227d40256d87cf2754"} err="failed to get container status \"4fc7f706e72ed8b03fe418da15a64a615c97e7fb4d6d97227d40256d87cf2754\": rpc error: code = NotFound desc = could not find container \"4fc7f706e72ed8b03fe418da15a64a615c97e7fb4d6d97227d40256d87cf2754\": container with ID starting with 4fc7f706e72ed8b03fe418da15a64a615c97e7fb4d6d97227d40256d87cf2754 not found: ID does not exist"
Apr 24 21:37:41.412286 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:41.412193 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5qrn/must-gather-6h8g9" event={"ID":"fb09aa5d-1edc-481f-9168-51c7c4a760aa","Type":"ContainerStarted","Data":"7020ee12a13af26310cf7e99fe2d9827d122fe6099cd69cc93e5a53360598758"}
Apr 24 21:37:41.412286 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:41.412244 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5qrn/must-gather-6h8g9" event={"ID":"fb09aa5d-1edc-481f-9168-51c7c4a760aa","Type":"ContainerStarted","Data":"ae5f780b036b7f95e138ebdea16afd794e72b80232842084a7bdea507a11e626"}
Apr 24 21:37:41.432843 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:41.432788 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-z5qrn/must-gather-6h8g9" podStartSLOduration=1.7871544670000001 podStartE2EDuration="6.432770615s" podCreationTimestamp="2026-04-24 21:37:35 +0000 UTC" firstStartedPulling="2026-04-24 21:37:36.254724446 +0000 UTC m=+1244.389123046" lastFinishedPulling="2026-04-24 21:37:40.900340595 +0000 UTC m=+1249.034739194" observedRunningTime="2026-04-24 21:37:41.430623989 +0000 UTC m=+1249.565022636" watchObservedRunningTime="2026-04-24 21:37:41.432770615 +0000 UTC m=+1249.567169228"
Apr 24 21:37:42.437277 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:42.437204 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" path="/var/lib/kubelet/pods/8547082c-6adf-40d2-8dce-e15eb63349b2/volumes"
Apr 24 21:37:59.470557 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:59.470521 2569 generic.go:358] "Generic (PLEG): container finished" podID="fb09aa5d-1edc-481f-9168-51c7c4a760aa" containerID="ae5f780b036b7f95e138ebdea16afd794e72b80232842084a7bdea507a11e626" exitCode=0
Apr 24 21:37:59.470991 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:59.470571 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5qrn/must-gather-6h8g9" event={"ID":"fb09aa5d-1edc-481f-9168-51c7c4a760aa","Type":"ContainerDied","Data":"ae5f780b036b7f95e138ebdea16afd794e72b80232842084a7bdea507a11e626"}
Apr 24 21:37:59.470991 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:59.470860 2569 scope.go:117] "RemoveContainer" containerID="ae5f780b036b7f95e138ebdea16afd794e72b80232842084a7bdea507a11e626"
Apr 24 21:37:59.552831 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:37:59.552801 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z5qrn_must-gather-6h8g9_fb09aa5d-1edc-481f-9168-51c7c4a760aa/gather/0.log"
Apr 24 21:38:00.078200 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.078162 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pg6qz/must-gather-2k6hw"]
Apr 24 21:38:00.078537 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.078519 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kube-rbac-proxy"
Apr 24 21:38:00.078537 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.078536 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kube-rbac-proxy"
Apr 24 21:38:00.078749 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.078549 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="storage-initializer"
Apr 24 21:38:00.078749 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.078557 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="storage-initializer"
Apr 24 21:38:00.078749 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.078572 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kserve-container"
Apr 24 21:38:00.078749 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.078580 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kserve-container"
Apr 24 21:38:00.078749 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.078683 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kube-rbac-proxy"
Apr 24 21:38:00.078749 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.078696 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8547082c-6adf-40d2-8dce-e15eb63349b2" containerName="kserve-container"
Apr 24 21:38:00.082126 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.082105 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pg6qz/must-gather-2k6hw"
Apr 24 21:38:00.085033 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.085012 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pg6qz\"/\"openshift-service-ca.crt\""
Apr 24 21:38:00.085121 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.085052 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-pg6qz\"/\"default-dockercfg-xvlcd\""
Apr 24 21:38:00.085166 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.085116 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pg6qz\"/\"kube-root-ca.crt\""
Apr 24 21:38:00.089182 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.088982 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pg6qz/must-gather-2k6hw"]
Apr 24 21:38:00.211342 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.211310 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm8t7\" (UniqueName: \"kubernetes.io/projected/91e266e6-b516-48dc-afe1-716cfe0305d6-kube-api-access-rm8t7\") pod \"must-gather-2k6hw\" (UID: \"91e266e6-b516-48dc-afe1-716cfe0305d6\") " pod="openshift-must-gather-pg6qz/must-gather-2k6hw"
Apr 24 21:38:00.211524 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.211348 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/91e266e6-b516-48dc-afe1-716cfe0305d6-must-gather-output\") pod \"must-gather-2k6hw\" (UID: \"91e266e6-b516-48dc-afe1-716cfe0305d6\") " pod="openshift-must-gather-pg6qz/must-gather-2k6hw"
Apr 24 21:38:00.312352 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.312321 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm8t7\" (UniqueName: \"kubernetes.io/projected/91e266e6-b516-48dc-afe1-716cfe0305d6-kube-api-access-rm8t7\") pod \"must-gather-2k6hw\" (UID: \"91e266e6-b516-48dc-afe1-716cfe0305d6\") " pod="openshift-must-gather-pg6qz/must-gather-2k6hw"
Apr 24 21:38:00.312352 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.312352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/91e266e6-b516-48dc-afe1-716cfe0305d6-must-gather-output\") pod \"must-gather-2k6hw\" (UID: \"91e266e6-b516-48dc-afe1-716cfe0305d6\") " pod="openshift-must-gather-pg6qz/must-gather-2k6hw"
Apr 24 21:38:00.312658 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.312643 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/91e266e6-b516-48dc-afe1-716cfe0305d6-must-gather-output\") pod \"must-gather-2k6hw\" (UID: \"91e266e6-b516-48dc-afe1-716cfe0305d6\") " pod="openshift-must-gather-pg6qz/must-gather-2k6hw"
Apr 24 21:38:00.321518 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.321484 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm8t7\" (UniqueName: \"kubernetes.io/projected/91e266e6-b516-48dc-afe1-716cfe0305d6-kube-api-access-rm8t7\") pod \"must-gather-2k6hw\" (UID: \"91e266e6-b516-48dc-afe1-716cfe0305d6\") " pod="openshift-must-gather-pg6qz/must-gather-2k6hw"
Apr 24 21:38:00.391612 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.391561 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pg6qz/must-gather-2k6hw"
Apr 24 21:38:00.511841 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:00.511749 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pg6qz/must-gather-2k6hw"]
Apr 24 21:38:00.513944 ip-10-0-132-118 kubenswrapper[2569]: W0424 21:38:00.513916 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91e266e6_b516_48dc_afe1_716cfe0305d6.slice/crio-5398441690f96c1fcb932a61c11c2c1de7a783cf7c6d776d703682ffb9809730 WatchSource:0}: Error finding container 5398441690f96c1fcb932a61c11c2c1de7a783cf7c6d776d703682ffb9809730: Status 404 returned error can't find the container with id 5398441690f96c1fcb932a61c11c2c1de7a783cf7c6d776d703682ffb9809730
Apr 24 21:38:01.478940 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:01.478870 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pg6qz/must-gather-2k6hw" event={"ID":"91e266e6-b516-48dc-afe1-716cfe0305d6","Type":"ContainerStarted","Data":"507fc7feea4630c9b022520340750804327191e9c403af59275e9a7727757dae"}
Apr 24 21:38:01.478940 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:01.478912 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pg6qz/must-gather-2k6hw" event={"ID":"91e266e6-b516-48dc-afe1-716cfe0305d6","Type":"ContainerStarted","Data":"5398441690f96c1fcb932a61c11c2c1de7a783cf7c6d776d703682ffb9809730"}
Apr 24 21:38:02.484556 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:02.484521 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pg6qz/must-gather-2k6hw" event={"ID":"91e266e6-b516-48dc-afe1-716cfe0305d6","Type":"ContainerStarted","Data":"a0d7277aadd7b5382a6bfdef736f3835b44500a3fa7189a12979dedcae405c49"}
Apr 24 21:38:02.506901 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:02.506842 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pg6qz/must-gather-2k6hw" podStartSLOduration=1.686082431 podStartE2EDuration="2.506822777s" podCreationTimestamp="2026-04-24 21:38:00 +0000 UTC" firstStartedPulling="2026-04-24 21:38:00.515861233 +0000 UTC m=+1268.650259834" lastFinishedPulling="2026-04-24 21:38:01.336601581 +0000 UTC m=+1269.471000180" observedRunningTime="2026-04-24 21:38:02.503882089 +0000 UTC m=+1270.638280711" watchObservedRunningTime="2026-04-24 21:38:02.506822777 +0000 UTC m=+1270.641221399"
Apr 24 21:38:02.830776 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:02.830707 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-jj6lg_f98087fe-7185-47dc-a94a-dc47079533a9/global-pull-secret-syncer/0.log"
Apr 24 21:38:02.926263 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:02.926232 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-lhcn8_7cbedbda-69db-41ea-8652-f7e83cd6b251/konnectivity-agent/0.log"
Apr 24 21:38:03.007346 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:03.007309 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-118.ec2.internal_0eda8f5a4a0c114d01bb72ca7f3afc81/haproxy/0.log"
Apr 24 21:38:04.920864 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:04.920811 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-z5qrn/must-gather-6h8g9"]
Apr 24 21:38:04.921422 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:04.921184 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-z5qrn/must-gather-6h8g9" podUID="fb09aa5d-1edc-481f-9168-51c7c4a760aa" containerName="copy" containerID="cri-o://7020ee12a13af26310cf7e99fe2d9827d122fe6099cd69cc93e5a53360598758" gracePeriod=2
Apr 24 21:38:04.923419 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:04.923393 2569 status_manager.go:895] "Failed to get status for pod" podUID="fb09aa5d-1edc-481f-9168-51c7c4a760aa" pod="openshift-must-gather-z5qrn/must-gather-6h8g9" err="pods \"must-gather-6h8g9\" is forbidden: User \"system:node:ip-10-0-132-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-z5qrn\": no relationship found between node 'ip-10-0-132-118.ec2.internal' and this object"
Apr 24 21:38:04.926338 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:04.926321 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-z5qrn/must-gather-6h8g9"]
Apr 24 21:38:05.296326 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.294828 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z5qrn_must-gather-6h8g9_fb09aa5d-1edc-481f-9168-51c7c4a760aa/copy/0.log"
Apr 24 21:38:05.296326 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.295260 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5qrn/must-gather-6h8g9"
Apr 24 21:38:05.299364 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.297715 2569 status_manager.go:895] "Failed to get status for pod" podUID="fb09aa5d-1edc-481f-9168-51c7c4a760aa" pod="openshift-must-gather-z5qrn/must-gather-6h8g9" err="pods \"must-gather-6h8g9\" is forbidden: User \"system:node:ip-10-0-132-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-z5qrn\": no relationship found between node 'ip-10-0-132-118.ec2.internal' and this object"
Apr 24 21:38:05.361319 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.361151 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fb09aa5d-1edc-481f-9168-51c7c4a760aa-must-gather-output\") pod \"fb09aa5d-1edc-481f-9168-51c7c4a760aa\" (UID: \"fb09aa5d-1edc-481f-9168-51c7c4a760aa\") "
Apr 24 21:38:05.361319 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.361197 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ldqg\" (UniqueName: \"kubernetes.io/projected/fb09aa5d-1edc-481f-9168-51c7c4a760aa-kube-api-access-8ldqg\") pod \"fb09aa5d-1edc-481f-9168-51c7c4a760aa\" (UID: \"fb09aa5d-1edc-481f-9168-51c7c4a760aa\") "
Apr 24 21:38:05.365375 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.362935 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb09aa5d-1edc-481f-9168-51c7c4a760aa-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "fb09aa5d-1edc-481f-9168-51c7c4a760aa" (UID: "fb09aa5d-1edc-481f-9168-51c7c4a760aa"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:38:05.369776 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.366868 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb09aa5d-1edc-481f-9168-51c7c4a760aa-kube-api-access-8ldqg" (OuterVolumeSpecName: "kube-api-access-8ldqg") pod "fb09aa5d-1edc-481f-9168-51c7c4a760aa" (UID: "fb09aa5d-1edc-481f-9168-51c7c4a760aa"). InnerVolumeSpecName "kube-api-access-8ldqg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:38:05.462829 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.462789 2569 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fb09aa5d-1edc-481f-9168-51c7c4a760aa-must-gather-output\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:38:05.462829 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.462828 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8ldqg\" (UniqueName: \"kubernetes.io/projected/fb09aa5d-1edc-481f-9168-51c7c4a760aa-kube-api-access-8ldqg\") on node \"ip-10-0-132-118.ec2.internal\" DevicePath \"\""
Apr 24 21:38:05.508456 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.499053 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z5qrn_must-gather-6h8g9_fb09aa5d-1edc-481f-9168-51c7c4a760aa/copy/0.log"
Apr 24 21:38:05.508456 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.499459 2569 generic.go:358] "Generic (PLEG): container finished" podID="fb09aa5d-1edc-481f-9168-51c7c4a760aa" containerID="7020ee12a13af26310cf7e99fe2d9827d122fe6099cd69cc93e5a53360598758" exitCode=143
Apr 24 21:38:05.508456 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.499559 2569 scope.go:117] "RemoveContainer" containerID="7020ee12a13af26310cf7e99fe2d9827d122fe6099cd69cc93e5a53360598758"
Apr 24 21:38:05.508456 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.499702 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5qrn/must-gather-6h8g9"
Apr 24 21:38:05.509612 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.509556 2569 status_manager.go:895] "Failed to get status for pod" podUID="fb09aa5d-1edc-481f-9168-51c7c4a760aa" pod="openshift-must-gather-z5qrn/must-gather-6h8g9" err="pods \"must-gather-6h8g9\" is forbidden: User \"system:node:ip-10-0-132-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-z5qrn\": no relationship found between node 'ip-10-0-132-118.ec2.internal' and this object"
Apr 24 21:38:05.519615 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.518157 2569 scope.go:117] "RemoveContainer" containerID="ae5f780b036b7f95e138ebdea16afd794e72b80232842084a7bdea507a11e626"
Apr 24 21:38:05.534092 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.534004 2569 status_manager.go:895] "Failed to get status for pod" podUID="fb09aa5d-1edc-481f-9168-51c7c4a760aa" pod="openshift-must-gather-z5qrn/must-gather-6h8g9" err="pods \"must-gather-6h8g9\" is forbidden: User \"system:node:ip-10-0-132-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-z5qrn\": no relationship found between node 'ip-10-0-132-118.ec2.internal' and this object"
Apr 24 21:38:05.551066 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.547639 2569 scope.go:117] "RemoveContainer" containerID="7020ee12a13af26310cf7e99fe2d9827d122fe6099cd69cc93e5a53360598758"
Apr 24 21:38:05.551185 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:38:05.551153 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7020ee12a13af26310cf7e99fe2d9827d122fe6099cd69cc93e5a53360598758\": container with ID starting with 7020ee12a13af26310cf7e99fe2d9827d122fe6099cd69cc93e5a53360598758 not found: ID does not exist" containerID="7020ee12a13af26310cf7e99fe2d9827d122fe6099cd69cc93e5a53360598758"
Apr 24 21:38:05.551255 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.551200 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7020ee12a13af26310cf7e99fe2d9827d122fe6099cd69cc93e5a53360598758"} err="failed to get container status \"7020ee12a13af26310cf7e99fe2d9827d122fe6099cd69cc93e5a53360598758\": rpc error: code = NotFound desc = could not find container \"7020ee12a13af26310cf7e99fe2d9827d122fe6099cd69cc93e5a53360598758\": container with ID starting with 7020ee12a13af26310cf7e99fe2d9827d122fe6099cd69cc93e5a53360598758 not found: ID does not exist"
Apr 24 21:38:05.551255 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.551229 2569 scope.go:117] "RemoveContainer" containerID="ae5f780b036b7f95e138ebdea16afd794e72b80232842084a7bdea507a11e626"
Apr 24 21:38:05.551704 ip-10-0-132-118 kubenswrapper[2569]: E0424 21:38:05.551605 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae5f780b036b7f95e138ebdea16afd794e72b80232842084a7bdea507a11e626\": container with ID starting with ae5f780b036b7f95e138ebdea16afd794e72b80232842084a7bdea507a11e626 not found: ID does not exist" containerID="ae5f780b036b7f95e138ebdea16afd794e72b80232842084a7bdea507a11e626"
Apr 24 21:38:05.551704 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:05.551637 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae5f780b036b7f95e138ebdea16afd794e72b80232842084a7bdea507a11e626"} err="failed to get container status \"ae5f780b036b7f95e138ebdea16afd794e72b80232842084a7bdea507a11e626\": rpc error: code = NotFound desc = could not find container \"ae5f780b036b7f95e138ebdea16afd794e72b80232842084a7bdea507a11e626\": container with ID starting with ae5f780b036b7f95e138ebdea16afd794e72b80232842084a7bdea507a11e626 not found: ID does not exist"
Apr 24 21:38:06.336051 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:06.336015 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e4bea200-e5a0-4a4d-a216-32f239a14c77/alertmanager/0.log"
Apr 24 21:38:06.373180 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:06.373144 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e4bea200-e5a0-4a4d-a216-32f239a14c77/config-reloader/0.log"
Apr 24 21:38:06.415064 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:06.415033 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e4bea200-e5a0-4a4d-a216-32f239a14c77/kube-rbac-proxy-web/0.log"
Apr 24 21:38:06.437416 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:06.437375 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb09aa5d-1edc-481f-9168-51c7c4a760aa" path="/var/lib/kubelet/pods/fb09aa5d-1edc-481f-9168-51c7c4a760aa/volumes"
Apr 24 21:38:06.485837 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:06.485805 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e4bea200-e5a0-4a4d-a216-32f239a14c77/kube-rbac-proxy/0.log"
Apr 24 21:38:06.534091 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:06.534025 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e4bea200-e5a0-4a4d-a216-32f239a14c77/kube-rbac-proxy-metric/0.log"
Apr 24 21:38:06.569734 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:06.569705 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e4bea200-e5a0-4a4d-a216-32f239a14c77/prom-label-proxy/0.log"
Apr 24 21:38:06.604185 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:06.604066 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e4bea200-e5a0-4a4d-a216-32f239a14c77/init-config-reloader/0.log"
Apr 24 21:38:06.686379 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:06.686336 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mmt7h_7ba14ffd-52bc-443a-827d-b237a10721e1/kube-state-metrics/0.log"
Apr 24 21:38:06.708832 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:06.708792 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mmt7h_7ba14ffd-52bc-443a-827d-b237a10721e1/kube-rbac-proxy-main/0.log"
Apr 24 21:38:06.744502 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:06.744478 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mmt7h_7ba14ffd-52bc-443a-827d-b237a10721e1/kube-rbac-proxy-self/0.log"
Apr 24 21:38:06.782161 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:06.782134 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5c68b88bf5-lpq5h_1b4e5d80-ddd3-4111-9592-2ed3e29a3669/metrics-server/0.log"
Apr 24 21:38:06.811441 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:06.811358 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-2dmnr_8ddb507d-944f-4cf5-8f38-319dacf0c8b1/monitoring-plugin/0.log"
Apr 24 21:38:07.061044 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.061010 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-z9gbt_1352a5df-0a6a-4562-ae3d-1061283310eb/node-exporter/0.log"
Apr 24 21:38:07.095906 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.095826 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-z9gbt_1352a5df-0a6a-4562-ae3d-1061283310eb/kube-rbac-proxy/0.log"
Apr 24 21:38:07.130118 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.130096 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-z9gbt_1352a5df-0a6a-4562-ae3d-1061283310eb/init-textfile/0.log"
Apr 24 21:38:07.167190 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.167085 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-dzbsw_b1111e14-fede-42ce-8a23-a8d08526e8c6/kube-rbac-proxy-main/0.log"
Apr 24 21:38:07.190344 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.190308 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-dzbsw_b1111e14-fede-42ce-8a23-a8d08526e8c6/kube-rbac-proxy-self/0.log"
Apr 24 21:38:07.213275 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.213241 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-dzbsw_b1111e14-fede-42ce-8a23-a8d08526e8c6/openshift-state-metrics/0.log"
Apr 24 21:38:07.259528 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.259488 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c8b31da7-e241-4355-9dcc-31ff04cfdc36/prometheus/0.log"
Apr 24 21:38:07.285050 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.285021 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c8b31da7-e241-4355-9dcc-31ff04cfdc36/config-reloader/0.log"
Apr 24 21:38:07.313285 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.313258 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c8b31da7-e241-4355-9dcc-31ff04cfdc36/thanos-sidecar/0.log"
Apr 24 21:38:07.340035 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.340005 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c8b31da7-e241-4355-9dcc-31ff04cfdc36/kube-rbac-proxy-web/0.log"
Apr 24 21:38:07.365252 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.365145 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c8b31da7-e241-4355-9dcc-31ff04cfdc36/kube-rbac-proxy/0.log"
Apr 24 21:38:07.419173 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.419134 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c8b31da7-e241-4355-9dcc-31ff04cfdc36/kube-rbac-proxy-thanos/0.log"
Apr 24 21:38:07.469592 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.469557 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c8b31da7-e241-4355-9dcc-31ff04cfdc36/init-config-reloader/0.log"
Apr 24 21:38:07.650649 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.650564 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-764bbb7fd8-mfckr_e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b/telemeter-client/0.log"
Apr 24 21:38:07.735613 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.735571 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-764bbb7fd8-mfckr_e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b/reload/0.log"
Apr 24 21:38:07.794802 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.794770 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-764bbb7fd8-mfckr_e7db6a57-49dd-4d4c-9fdc-1b5a3c89670b/kube-rbac-proxy/0.log"
Apr 24 21:38:07.879541 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.879511 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b99dfb4fb-nzvxg_dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa/thanos-query/0.log"
Apr 24 21:38:07.925963 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.925884 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b99dfb4fb-nzvxg_dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa/kube-rbac-proxy-web/0.log"
Apr 24 21:38:07.958093 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.958067 2569
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b99dfb4fb-nzvxg_dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa/kube-rbac-proxy/0.log" Apr 24 21:38:07.991950 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:07.991921 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b99dfb4fb-nzvxg_dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa/prom-label-proxy/0.log" Apr 24 21:38:08.033386 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:08.033345 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b99dfb4fb-nzvxg_dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa/kube-rbac-proxy-rules/0.log" Apr 24 21:38:08.064751 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:08.064712 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b99dfb4fb-nzvxg_dfc47c98-7bc9-4cf3-83b3-d0ea02ed0bfa/kube-rbac-proxy-metrics/0.log" Apr 24 21:38:09.646930 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.646892 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs"] Apr 24 21:38:09.647384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.647329 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb09aa5d-1edc-481f-9168-51c7c4a760aa" containerName="copy" Apr 24 21:38:09.647384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.647343 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb09aa5d-1edc-481f-9168-51c7c4a760aa" containerName="copy" Apr 24 21:38:09.647384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.647355 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb09aa5d-1edc-481f-9168-51c7c4a760aa" containerName="gather" Apr 24 21:38:09.647384 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.647360 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fb09aa5d-1edc-481f-9168-51c7c4a760aa" containerName="gather" Apr 24 21:38:09.647517 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.647413 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb09aa5d-1edc-481f-9168-51c7c4a760aa" containerName="copy" Apr 24 21:38:09.647517 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.647423 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb09aa5d-1edc-481f-9168-51c7c4a760aa" containerName="gather" Apr 24 21:38:09.651714 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.651691 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:09.665330 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.665307 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs"] Apr 24 21:38:09.704653 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.704625 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/582f51de-ccb9-4aea-9640-004c7ee21f17-proc\") pod \"perf-node-gather-daemonset-blccs\" (UID: \"582f51de-ccb9-4aea-9640-004c7ee21f17\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:09.704826 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.704662 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/582f51de-ccb9-4aea-9640-004c7ee21f17-podres\") pod \"perf-node-gather-daemonset-blccs\" (UID: \"582f51de-ccb9-4aea-9640-004c7ee21f17\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:09.704826 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.704684 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-r7t8m\" (UniqueName: \"kubernetes.io/projected/582f51de-ccb9-4aea-9640-004c7ee21f17-kube-api-access-r7t8m\") pod \"perf-node-gather-daemonset-blccs\" (UID: \"582f51de-ccb9-4aea-9640-004c7ee21f17\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:09.704826 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.704744 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/582f51de-ccb9-4aea-9640-004c7ee21f17-lib-modules\") pod \"perf-node-gather-daemonset-blccs\" (UID: \"582f51de-ccb9-4aea-9640-004c7ee21f17\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:09.704826 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.704802 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/582f51de-ccb9-4aea-9640-004c7ee21f17-sys\") pod \"perf-node-gather-daemonset-blccs\" (UID: \"582f51de-ccb9-4aea-9640-004c7ee21f17\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:09.805722 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.805683 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/582f51de-ccb9-4aea-9640-004c7ee21f17-podres\") pod \"perf-node-gather-daemonset-blccs\" (UID: \"582f51de-ccb9-4aea-9640-004c7ee21f17\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:09.805907 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.805732 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7t8m\" (UniqueName: \"kubernetes.io/projected/582f51de-ccb9-4aea-9640-004c7ee21f17-kube-api-access-r7t8m\") pod \"perf-node-gather-daemonset-blccs\" (UID: \"582f51de-ccb9-4aea-9640-004c7ee21f17\") " 
pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:09.805907 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.805807 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/582f51de-ccb9-4aea-9640-004c7ee21f17-lib-modules\") pod \"perf-node-gather-daemonset-blccs\" (UID: \"582f51de-ccb9-4aea-9640-004c7ee21f17\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:09.805907 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.805852 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/582f51de-ccb9-4aea-9640-004c7ee21f17-podres\") pod \"perf-node-gather-daemonset-blccs\" (UID: \"582f51de-ccb9-4aea-9640-004c7ee21f17\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:09.805907 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.805855 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/582f51de-ccb9-4aea-9640-004c7ee21f17-sys\") pod \"perf-node-gather-daemonset-blccs\" (UID: \"582f51de-ccb9-4aea-9640-004c7ee21f17\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:09.806107 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.805911 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/582f51de-ccb9-4aea-9640-004c7ee21f17-sys\") pod \"perf-node-gather-daemonset-blccs\" (UID: \"582f51de-ccb9-4aea-9640-004c7ee21f17\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:09.806107 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.805960 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/582f51de-ccb9-4aea-9640-004c7ee21f17-lib-modules\") pod 
\"perf-node-gather-daemonset-blccs\" (UID: \"582f51de-ccb9-4aea-9640-004c7ee21f17\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:09.806107 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.805975 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/582f51de-ccb9-4aea-9640-004c7ee21f17-proc\") pod \"perf-node-gather-daemonset-blccs\" (UID: \"582f51de-ccb9-4aea-9640-004c7ee21f17\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:09.806107 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.806009 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/582f51de-ccb9-4aea-9640-004c7ee21f17-proc\") pod \"perf-node-gather-daemonset-blccs\" (UID: \"582f51de-ccb9-4aea-9640-004c7ee21f17\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:09.823679 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.823646 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7t8m\" (UniqueName: \"kubernetes.io/projected/582f51de-ccb9-4aea-9640-004c7ee21f17-kube-api-access-r7t8m\") pod \"perf-node-gather-daemonset-blccs\" (UID: \"582f51de-ccb9-4aea-9640-004c7ee21f17\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:09.961607 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:09.961536 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:10.084419 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:10.084348 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs"] Apr 24 21:38:10.520043 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:10.520007 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" event={"ID":"582f51de-ccb9-4aea-9640-004c7ee21f17","Type":"ContainerStarted","Data":"2bd5db90cdb7820a453493a59c59872de46b3fb6c15a7bfe7e1c38ce4f82ce7a"} Apr 24 21:38:10.520043 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:10.520044 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" event={"ID":"582f51de-ccb9-4aea-9640-004c7ee21f17","Type":"ContainerStarted","Data":"1c6618801f89d65e6348c9d1943fbf3f9a6b8f7003bf476d0c6b5be407277b4e"} Apr 24 21:38:10.520295 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:10.520166 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:10.539871 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:10.539819 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" podStartSLOduration=1.539798003 podStartE2EDuration="1.539798003s" podCreationTimestamp="2026-04-24 21:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:38:10.536748139 +0000 UTC m=+1278.671146759" watchObservedRunningTime="2026-04-24 21:38:10.539798003 +0000 UTC m=+1278.674196626" Apr 24 21:38:11.386308 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:11.386277 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-cdfn8_65af2f93-95ea-42a0-a1bc-090aac46e966/dns/0.log" Apr 24 21:38:11.417881 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:11.417856 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cdfn8_65af2f93-95ea-42a0-a1bc-090aac46e966/kube-rbac-proxy/0.log" Apr 24 21:38:11.535726 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:11.535699 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hpsfh_7f3244bf-6a30-4809-b6e8-2f27daaaf6ae/dns-node-resolver/0.log" Apr 24 21:38:12.068962 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:12.068934 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9s62d_c4045dfd-b9dc-46c0-9964-9335ef05615d/node-ca/0.log" Apr 24 21:38:13.263157 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:13.263124 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6hhf2_3ab73dd3-791f-4c51-800b-67742dfd636b/serve-healthcheck-canary/0.log" Apr 24 21:38:13.894299 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:13.894275 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8928p_1de28cba-9711-40f3-affe-fd088ee9e25b/kube-rbac-proxy/0.log" Apr 24 21:38:13.915101 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:13.915071 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8928p_1de28cba-9711-40f3-affe-fd088ee9e25b/exporter/0.log" Apr 24 21:38:13.939992 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:13.939969 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8928p_1de28cba-9711-40f3-affe-fd088ee9e25b/extractor/0.log" Apr 24 21:38:16.247540 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:16.247511 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_odh-model-controller-696fc77849-gs7ns_0a228f5a-483e-497b-a305-5d8f68b5ed9f/manager/0.log" Apr 24 21:38:16.315140 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:16.315115 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-vzmdl_219f70c0-46a5-4c30-9da0-a5a37e49260c/s3-init/0.log" Apr 24 21:38:16.534929 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:16.534901 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-blccs" Apr 24 21:38:21.792432 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:21.792406 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bhz9g_68711042-2c0d-43ee-aac6-684c532f8d59/kube-multus-additional-cni-plugins/0.log" Apr 24 21:38:21.817004 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:21.816985 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bhz9g_68711042-2c0d-43ee-aac6-684c532f8d59/egress-router-binary-copy/0.log" Apr 24 21:38:21.837606 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:21.837584 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bhz9g_68711042-2c0d-43ee-aac6-684c532f8d59/cni-plugins/0.log" Apr 24 21:38:21.862318 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:21.862299 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bhz9g_68711042-2c0d-43ee-aac6-684c532f8d59/bond-cni-plugin/0.log" Apr 24 21:38:21.883848 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:21.883832 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bhz9g_68711042-2c0d-43ee-aac6-684c532f8d59/routeoverride-cni/0.log" Apr 24 21:38:21.907798 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:21.907778 2569 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bhz9g_68711042-2c0d-43ee-aac6-684c532f8d59/whereabouts-cni-bincopy/0.log" Apr 24 21:38:21.929430 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:21.929409 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bhz9g_68711042-2c0d-43ee-aac6-684c532f8d59/whereabouts-cni/0.log" Apr 24 21:38:22.327580 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:22.327545 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nwzth_10fe30c1-99bf-4857-b080-8aafa2ee3910/kube-multus/0.log" Apr 24 21:38:22.501930 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:22.501897 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zqp7l_ddd581ca-fe5d-4e33-965d-ad198f8af209/network-metrics-daemon/0.log" Apr 24 21:38:22.522432 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:22.522401 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zqp7l_ddd581ca-fe5d-4e33-965d-ad198f8af209/kube-rbac-proxy/0.log" Apr 24 21:38:23.274866 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:23.274838 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fsxch_e1e0c938-ec29-416f-8376-93b93ff2d991/ovn-controller/0.log" Apr 24 21:38:23.304621 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:23.304599 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fsxch_e1e0c938-ec29-416f-8376-93b93ff2d991/ovn-acl-logging/0.log" Apr 24 21:38:23.325471 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:23.325444 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fsxch_e1e0c938-ec29-416f-8376-93b93ff2d991/kube-rbac-proxy-node/0.log" Apr 24 21:38:23.355836 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:23.355813 2569 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fsxch_e1e0c938-ec29-416f-8376-93b93ff2d991/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 21:38:23.377169 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:23.377149 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fsxch_e1e0c938-ec29-416f-8376-93b93ff2d991/northd/0.log" Apr 24 21:38:23.399647 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:23.399624 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fsxch_e1e0c938-ec29-416f-8376-93b93ff2d991/nbdb/0.log" Apr 24 21:38:23.424164 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:23.424146 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fsxch_e1e0c938-ec29-416f-8376-93b93ff2d991/sbdb/0.log" Apr 24 21:38:23.605734 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:23.605705 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fsxch_e1e0c938-ec29-416f-8376-93b93ff2d991/ovnkube-controller/0.log" Apr 24 21:38:25.303815 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:25.303784 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-m9nk2_ac26e9c0-3977-40b9-a44e-d694b6663276/network-check-target-container/0.log" Apr 24 21:38:26.259089 ip-10-0-132-118 kubenswrapper[2569]: I0424 21:38:26.258997 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-288np_5cc754c0-fce1-4135-a603-509ef613e62d/iptables-alerter/0.log"