Apr 17 17:25:07.256982 ip-10-0-137-46 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 17:25:07.694770 ip-10-0-137-46 kubenswrapper[2546]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:25:07.694770 ip-10-0-137-46 kubenswrapper[2546]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 17:25:07.694770 ip-10-0-137-46 kubenswrapper[2546]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:25:07.694770 ip-10-0-137-46 kubenswrapper[2546]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 17:25:07.694770 ip-10-0-137-46 kubenswrapper[2546]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
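The deprecation warnings above all point at moving flag values into the file given by --config. A minimal sketch of the equivalent KubeletConfiguration stanza follows; note that on OpenShift this file is managed by the Machine Config Operator rather than edited by hand, and every value below except the CRI-O socket (which appears later in this log's flag dump) is illustrative, not taken from this node.

```yaml
# Hypothetical fragment of /etc/kubernetes/kubelet.conf (KubeletConfiguration).
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Replaces --container-runtime-endpoint; socket path matches the flag dump below.
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# Replaces --volume-plugin-dir (path is illustrative).
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# Replaces --system-reserved (values are illustrative).
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings
# (threshold is illustrative).
evictionHard:
  memory.available: 100Mi
```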
Apr 17 17:25:07.695650 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.695556 2546 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 17:25:07.699295 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699277 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:07.699295 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699295 2546 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699301 2546 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699306 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699310 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699313 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699316 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699319 2546 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699323 2546 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699326 2546 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699329 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699332 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699334 2546 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699337 2546 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699340 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699344 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699347 2546 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699349 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699352 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699355 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:07.699368 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699357 2546 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699360 2546 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699362 2546 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699365 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699368 2546 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699371 2546 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699373 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699376 2546 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699379 2546 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699382 2546 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699385 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699388 2546 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699391 2546 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699393 2546 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699396 2546 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699399 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699402 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699405 2546 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699409 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699412 2546 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:07.699874 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699414 2546 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699417 2546 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699419 2546 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699422 2546 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699424 2546 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699427 2546 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699429 2546 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699432 2546 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699434 2546 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699437 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699440 2546 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699442 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699445 2546 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699448 2546 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699451 2546 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699454 2546 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699457 2546 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699459 2546 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699462 2546 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699464 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699467 2546 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:07.700382 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699469 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699472 2546 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699476 2546 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699480 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699483 2546 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699487 2546 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699490 2546 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699492 2546 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699495 2546 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699499 2546 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699502 2546 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699504 2546 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699507 2546 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699510 2546 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699513 2546 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699516 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699519 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699521 2546 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699524 2546 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:07.700950 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699527 2546 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699529 2546 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699532 2546 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699535 2546 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699539 2546 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699542 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699977 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699983 2546 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699985 2546 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699988 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699991 2546 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699994 2546 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699996 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.699999 2546 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700002 2546 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700004 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700007 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700010 2546 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700012 2546 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:07.701414 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700015 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700017 2546 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700020 2546 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700023 2546 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700026 2546 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700028 2546 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700031 2546 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700033 2546 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700036 2546 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700038 2546 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700041 2546 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700043 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700046 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700048 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700051 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700054 2546 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700057 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700059 2546 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700062 2546 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700065 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:07.701888 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700068 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700070 2546 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700073 2546 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700075 2546 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700078 2546 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700080 2546 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700083 2546 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700085 2546 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700088 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700091 2546 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700093 2546 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700096 2546 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700098 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700101 2546 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700105 2546 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700108 2546 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700111 2546 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700114 2546 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700117 2546 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:07.702375 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700120 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700123 2546 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700126 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700129 2546 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700132 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700134 2546 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700137 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700140 2546 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700143 2546 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700145 2546 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700148 2546 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700151 2546 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700155 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700158 2546 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700161 2546 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700163 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700166 2546 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700169 2546 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700171 2546 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700174 2546 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:07.702957 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700176 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700179 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700182 2546 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700184 2546 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700187 2546 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700190 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700192 2546 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700195 2546 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700197 2546 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700200 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700203 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700205 2546 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700207 2546 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.700210 2546 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701740 2546 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701750 2546 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701758 2546 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701763 2546 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701768 2546 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701771 2546 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701776 2546 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 17:25:07.703741 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701781 2546 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701784 2546 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701787 2546 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701791 2546 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701794 2546 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701797 2546 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701800 2546 flags.go:64] FLAG: --cgroup-root=""
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701803 2546 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701806 2546 flags.go:64] FLAG: --client-ca-file=""
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701809 2546 flags.go:64] FLAG: --cloud-config=""
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701812 2546 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701816 2546 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701820 2546 flags.go:64] FLAG: --cluster-domain=""
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701823 2546 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701826 2546 flags.go:64] FLAG: --config-dir=""
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701829 2546 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701832 2546 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701841 2546 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701845 2546 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701848 2546 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701852 2546 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701855 2546 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701858 2546 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701862 2546 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701865 2546 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 17:25:07.704305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701868 2546 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701872 2546 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701876 2546 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701879 2546 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701882 2546 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701885 2546 flags.go:64] FLAG: --enable-server="true"
Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701889 2546 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701895 2546 flags.go:64] FLAG: --event-burst="100"
Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701898 2546 flags.go:64] FLAG: --event-qps="50"
Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701902 2546 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701905 2546 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701908 2546 flags.go:64] FLAG: --eviction-hard="" Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701912 2546 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701915 2546 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701918 2546 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701921 2546 flags.go:64] FLAG: --eviction-soft="" Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701924 2546 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701927 2546 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701930 2546 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701933 2546 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701936 2546 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701938 2546 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701941 2546 flags.go:64] FLAG: --feature-gates="" Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701945 2546 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701948 2546 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 17:25:07.704911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701951 2546 flags.go:64] 
FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701955 2546 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701958 2546 flags.go:64] FLAG: --healthz-port="10248" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701962 2546 flags.go:64] FLAG: --help="false" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701965 2546 flags.go:64] FLAG: --hostname-override="ip-10-0-137-46.ec2.internal" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701968 2546 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701972 2546 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701974 2546 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701978 2546 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701981 2546 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701984 2546 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701987 2546 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701990 2546 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701993 2546 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 
17:25:07.701996 2546 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.701999 2546 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702003 2546 flags.go:64] FLAG: --kube-reserved="" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702006 2546 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702009 2546 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702012 2546 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702014 2546 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702017 2546 flags.go:64] FLAG: --lock-file="" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702021 2546 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702023 2546 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 17:25:07.705533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702027 2546 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702032 2546 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702035 2546 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702038 2546 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702041 2546 flags.go:64] FLAG: --logging-format="text" Apr 17 17:25:07.706138 
ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702044 2546 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702047 2546 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702050 2546 flags.go:64] FLAG: --manifest-url="" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702053 2546 flags.go:64] FLAG: --manifest-url-header="" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702057 2546 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702060 2546 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702065 2546 flags.go:64] FLAG: --max-pods="110" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702068 2546 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702071 2546 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702074 2546 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702077 2546 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702080 2546 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702083 2546 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702087 2546 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 17:25:07.706138 ip-10-0-137-46 
kubenswrapper[2546]: I0417 17:25:07.702096 2546 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702100 2546 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702103 2546 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702107 2546 flags.go:64] FLAG: --pod-cidr="" Apr 17 17:25:07.706138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702110 2546 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702116 2546 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702119 2546 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702123 2546 flags.go:64] FLAG: --pods-per-core="0" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702126 2546 flags.go:64] FLAG: --port="10250" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702129 2546 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702132 2546 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02e2dc005b06056e7" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702135 2546 flags.go:64] FLAG: --qos-reserved="" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702138 2546 flags.go:64] FLAG: --read-only-port="10255" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702141 2546 flags.go:64] FLAG: --register-node="true" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702144 
2546 flags.go:64] FLAG: --register-schedulable="true" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702147 2546 flags.go:64] FLAG: --register-with-taints="" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702151 2546 flags.go:64] FLAG: --registry-burst="10" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702153 2546 flags.go:64] FLAG: --registry-qps="5" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702156 2546 flags.go:64] FLAG: --reserved-cpus="" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702159 2546 flags.go:64] FLAG: --reserved-memory="" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702162 2546 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702165 2546 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702169 2546 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702171 2546 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702174 2546 flags.go:64] FLAG: --runonce="false" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702177 2546 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702180 2546 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702183 2546 flags.go:64] FLAG: --seccomp-default="false" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702186 2546 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702190 
2546 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 17:25:07.706695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702193 2546 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702196 2546 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702199 2546 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702202 2546 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702205 2546 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702208 2546 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702210 2546 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702230 2546 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702234 2546 flags.go:64] FLAG: --system-cgroups="" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702238 2546 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702243 2546 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702246 2546 flags.go:64] FLAG: --tls-cert-file="" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702249 2546 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702253 2546 flags.go:64] FLAG: --tls-min-version="" Apr 17 17:25:07.707322 
ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702256 2546 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702259 2546 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702262 2546 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702264 2546 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702268 2546 flags.go:64] FLAG: --v="2" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702272 2546 flags.go:64] FLAG: --version="false" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702276 2546 flags.go:64] FLAG: --vmodule="" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702282 2546 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702285 2546 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702387 2546 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:25:07.707322 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702391 2546 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702395 2546 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702398 2546 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702401 2546 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 
17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702404 2546 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702407 2546 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702410 2546 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702413 2546 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702416 2546 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702420 2546 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702423 2546 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702426 2546 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702429 2546 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702432 2546 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702435 2546 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702437 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:25:07.707907 ip-10-0-137-46 
kubenswrapper[2546]: W0417 17:25:07.702440 2546 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702443 2546 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702446 2546 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:25:07.707907 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702450 2546 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702453 2546 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702456 2546 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702458 2546 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702461 2546 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702464 2546 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702466 2546 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702469 2546 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702471 2546 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702474 2546 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNSInstall Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702476 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702479 2546 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702481 2546 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702486 2546 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702489 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702491 2546 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702494 2546 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702496 2546 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702499 2546 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702501 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:25:07.708429 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702504 2546 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702507 2546 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 
17:25:07.702510 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702513 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702515 2546 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702517 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702520 2546 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702523 2546 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702525 2546 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702528 2546 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702530 2546 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702533 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702536 2546 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702538 2546 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702540 2546 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:25:07.708954 ip-10-0-137-46 
kubenswrapper[2546]: W0417 17:25:07.702543 2546 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702545 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702548 2546 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702551 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:25:07.708954 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702553 2546 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702556 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702558 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702561 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702563 2546 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702565 2546 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702569 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702571 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702574 2546 feature_gate.go:328] 
unrecognized feature gate: NewOLM Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702577 2546 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702579 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702582 2546 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702584 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702587 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702589 2546 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702596 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702599 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702601 2546 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702604 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702606 2546 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:25:07.709435 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702609 2546 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:25:07.709941 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702612 
2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:07.709941 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702614 2546 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:07.709941 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702617 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:07.709941 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702619 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:07.709941 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702622 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:07.709941 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.702624 2546 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:07.709941 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.702636 2546 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:25:07.709941 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.709575 2546 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 17:25:07.709941 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.709593 2546 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 17:25:07.709941 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709641 2546 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:07.709941 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709646 2546 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:07.709941 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709650 2546 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:07.709941 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709653 2546 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:07.709941 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709656 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:07.709941 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709660 2546 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:07.709941 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709663 2546 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709665 2546 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709668 2546 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709670 2546 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709673 2546 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709690 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709693 2546 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709695 2546 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709699 2546 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709701 2546 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709704 2546 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709707 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709709 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709712 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709715 2546 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709718 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709721 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709723 2546 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709726 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709729 2546 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:07.710360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709731 2546 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709735 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709737 2546 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709740 2546 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709742 2546 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709746 2546 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709748 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709751 2546 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709754 2546 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709757 2546 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709759 2546 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709762 2546 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709766 2546 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709770 2546 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709773 2546 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709775 2546 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709778 2546 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709781 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709784 2546 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709786 2546 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:07.710865 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709789 2546 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709792 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709794 2546 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709798 2546 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709803 2546 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709806 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709809 2546 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709813 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709815 2546 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709818 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709821 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709824 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709826 2546 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709830 2546 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709833 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709836 2546 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709838 2546 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709842 2546 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709845 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709848 2546 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:07.711378 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709851 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709854 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709857 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709859 2546 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709862 2546 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709864 2546 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709867 2546 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709869 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709872 2546 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709874 2546 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709877 2546 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709880 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709882 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709885 2546 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709887 2546 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709890 2546 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709892 2546 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709896 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709899 2546 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:07.711893 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.709901 2546 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:07.712390 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.709907 2546 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:25:07.712390 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710007 2546 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:07.712390 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710012 2546 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:07.712390 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710015 2546 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:07.712390 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710018 2546 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:07.712390 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710020 2546 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:07.712390 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710023 2546 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:07.712390 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710026 2546 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:07.712390 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710028 2546 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:07.712390 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710031 2546 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:07.712390 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710034 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:07.712390 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710037 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:07.712390 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710039 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:07.712390 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710042 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:07.712390 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710045 2546 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:07.712390 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710047 2546 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710050 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710053 2546 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710056 2546 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710058 2546 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710061 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710063 2546 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710067 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710069 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710072 2546 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710075 2546 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710077 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710080 2546 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710082 2546 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710085 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710088 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710090 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710093 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710096 2546 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710099 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:07.712886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710101 2546 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710104 2546 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710106 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710109 2546 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710113 2546 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710117 2546 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710119 2546 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710123 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710126 2546 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710129 2546 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710132 2546 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710135 2546 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710138 2546 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710140 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710143 2546 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710145 2546 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710148 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710150 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710153 2546 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710155 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:07.713462 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710159 2546 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710161 2546 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710164 2546 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710166 2546 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710169 2546 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710171 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710174 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710176 2546 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710179 2546 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710181 2546 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710184 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710186 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710189 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710191 2546 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710194 2546 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710197 2546 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710199 2546 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710202 2546 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710205 2546 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710208 2546 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:07.713988 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710210 2546 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:07.714498 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710213 2546 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:07.714498 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710216 2546 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:07.714498 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710218 2546 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:07.714498 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710221 2546 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:07.714498 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710223 2546 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:07.714498 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710226 2546 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:07.714498 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710228 2546 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:07.714498 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710231 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:07.714498 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710233 2546 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:07.714498 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710237 2546 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:25:07.714498 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:07.710241 2546 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:07.714498 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.710246 2546 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:25:07.714498 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.711032 2546 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 17:25:07.714498 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.713089 2546 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 17:25:07.714498 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.713984 2546 server.go:1019] "Starting client certificate rotation"
Apr 17 17:25:07.714891 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.714072 2546 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:25:07.714891 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.714110 2546 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:25:07.739102 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.739079 2546 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:25:07.745553 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.745531 2546 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:25:07.763459 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.763437 2546 log.go:25] "Validated CRI v1 runtime API"
Apr 17 17:25:07.769024 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.769005 2546 log.go:25] "Validated CRI v1 image API"
Apr 17 17:25:07.771556 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.771533 2546 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 17:25:07.775281 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.775259 2546 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 e11b57a6-187e-44e0-b7e0-8a9ef61684c7:/dev/nvme0n1p3 f008b3f2-b6a7-465d-8651-2a23c3399c95:/dev/nvme0n1p4]
Apr 17 17:25:07.775361 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.775280 2546 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 17:25:07.781440 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.781417 2546 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:25:07.782340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.782193 2546 manager.go:217] Machine: {Timestamp:2026-04-17 17:25:07.780275321 +0000 UTC m=+0.402555986 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098812 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26a4a991c82c20923e96dbdf1eac25 SystemUUID:ec26a4a9-91c8-2c20-923e-96dbdf1eac25 BootID:6b010638-6f29-40c3-ba2f-31cf768f243d Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:de:e2:a7:c6:2d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:de:e2:a7:c6:2d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:92:6d:c1:f3:6e:26 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 17:25:07.782340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.782334 2546 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 17:25:07.782489 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.782456 2546 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 17:25:07.783554 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.783526 2546 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 17:25:07.783773 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.783556 2546 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-46.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 17:25:07.783863 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.783785 2546 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 17:25:07.783863 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.783798 2546 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 17:25:07.783863 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.783817 2546 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 17:25:07.784481 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.784470 2546 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 17:25:07.785756 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.785744 2546 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 17:25:07.785890 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.785879 2546 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 17:25:07.788763 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.788751 2546 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 17:25:07.788831 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.788770 2546 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 17:25:07.788831 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.788791 2546 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 17:25:07.788831 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.788806 2546 kubelet.go:397] "Adding apiserver pod source"
Apr 17 17:25:07.788831 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.788818 2546 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 17:25:07.789830 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.789816 2546 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 17:25:07.789894 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.789841 2546 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 17:25:07.792596 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.792579 2546 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 17:25:07.793886 ip-10-0-137-46
kubenswrapper[2546]: I0417 17:25:07.793872 2546 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 17:25:07.795087 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.795071 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 17:25:07.795154 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.795094 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 17:25:07.795154 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.795101 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 17:25:07.795154 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.795108 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 17:25:07.795154 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.795114 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 17:25:07.795154 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.795119 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 17:25:07.795154 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.795126 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 17:25:07.795154 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.795132 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 17:25:07.795154 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.795140 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 17:25:07.795154 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.795149 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 17:25:07.795384 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.795160 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 17:25:07.795384 
ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.795169 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 17:25:07.796517 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.796506 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 17:25:07.796550 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.796519 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 17:25:07.800248 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.800234 2546 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 17:25:07.800337 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.800269 2546 server.go:1295] "Started kubelet" Apr 17 17:25:07.800395 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.800370 2546 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 17:25:07.800426 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.800364 2546 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 17:25:07.800454 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.800427 2546 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 17:25:07.801157 ip-10-0-137-46 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 17:25:07.803035 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.803016 2546 server.go:317] "Adding debug handlers to kubelet server" Apr 17 17:25:07.804303 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.804288 2546 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 17:25:07.805108 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.805090 2546 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-46.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 17:25:07.805202 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:07.805114 2546 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 17:25:07.805202 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:07.805126 2546 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-46.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 17:25:07.809214 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:07.809195 2546 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 17:25:07.810320 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.810297 2546 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 17:25:07.810828 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.810811 2546 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 17:25:07.811439 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.811423 2546 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 17:25:07.812336 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.812313 2546 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 17:25:07.812416 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.812341 2546 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 17:25:07.812416 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:07.812351 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-46.ec2.internal\" not found" Apr 17 17:25:07.812526 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.812514 2546 reconstruct.go:97] "Volume reconstruction finished" Apr 17 17:25:07.812576 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.812544 2546 reconciler.go:26] "Reconciler: start to sync state" Apr 17 17:25:07.812576 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.812549 2546 factory.go:55] Registering systemd factory Apr 17 17:25:07.812657 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.812643 2546 factory.go:223] Registration of the systemd container factory successfully Apr 17 17:25:07.812876 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.812861 2546 factory.go:153] Registering CRI-O factory Apr 17 17:25:07.812944 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.812878 2546 factory.go:223] Registration of the crio container factory successfully Apr 17 
17:25:07.812944 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.812930 2546 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 17:25:07.813042 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.812964 2546 factory.go:103] Registering Raw factory Apr 17 17:25:07.813042 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.812981 2546 manager.go:1196] Started watching for new ooms in manager Apr 17 17:25:07.813532 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.813511 2546 manager.go:319] Starting recovery of all containers Apr 17 17:25:07.821490 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:07.821301 2546 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 17:25:07.821604 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:07.821475 2546 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-46.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 17:25:07.821921 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.821896 2546 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cfspj" Apr 17 17:25:07.822144 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:07.821184 2546 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-46.ec2.internal.18a734de64d88857 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-46.ec2.internal,UID:ip-10-0-137-46.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-46.ec2.internal,},FirstTimestamp:2026-04-17 17:25:07.800246359 +0000 UTC m=+0.422527026,LastTimestamp:2026-04-17 17:25:07.800246359 +0000 UTC m=+0.422527026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-46.ec2.internal,}" Apr 17 17:25:07.825026 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.825006 2546 manager.go:324] Recovery completed Apr 17 17:25:07.826640 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:07.826604 2546 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 17 17:25:07.829708 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.829695 2546 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:07.832321 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.832306 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:07.832373 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.832337 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:07.832373 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.832350 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasSufficientPID" Apr 17 
17:25:07.832883 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.832867 2546 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 17:25:07.832883 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.832882 2546 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 17:25:07.832969 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.832897 2546 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:25:07.834395 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:07.834332 2546 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-46.ec2.internal.18a734de66c1fadb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-46.ec2.internal,UID:ip-10-0-137-46.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-137-46.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-137-46.ec2.internal,},FirstTimestamp:2026-04-17 17:25:07.832322779 +0000 UTC m=+0.454603445,LastTimestamp:2026-04-17 17:25:07.832322779 +0000 UTC m=+0.454603445,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-46.ec2.internal,}" Apr 17 17:25:07.835212 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.835201 2546 policy_none.go:49] "None policy: Start" Apr 17 17:25:07.835255 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.835216 2546 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 17:25:07.835255 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.835225 2546 state_mem.go:35] "Initializing new in-memory state store" Apr 17 17:25:07.837544 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.837526 2546 csr.go:270] 
"Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cfspj" Apr 17 17:25:07.869312 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.869293 2546 manager.go:341] "Starting Device Plugin manager" Apr 17 17:25:07.884028 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:07.869337 2546 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 17:25:07.884028 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.869367 2546 server.go:85] "Starting device plugin registration server" Apr 17 17:25:07.884028 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.869651 2546 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 17:25:07.884028 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.869666 2546 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 17:25:07.884028 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.869812 2546 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 17:25:07.884028 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.869903 2546 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 17:25:07.884028 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.869913 2546 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 17:25:07.884028 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:07.870432 2546 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 17:25:07.884028 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:07.870476 2546 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-46.ec2.internal\" not found" Apr 17 17:25:07.947552 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.947467 2546 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 17:25:07.948687 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.948658 2546 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 17:25:07.948749 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.948700 2546 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 17:25:07.948749 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.948723 2546 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 17:25:07.948749 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.948729 2546 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 17:25:07.948876 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:07.948765 2546 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 17:25:07.952529 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.952512 2546 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:25:07.969984 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.969970 2546 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:07.970947 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.970932 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:07.971015 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.970959 2546 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:07.971015 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.970970 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:07.971015 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.970992 2546 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-46.ec2.internal" Apr 17 17:25:07.979210 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:07.979197 2546 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-46.ec2.internal" Apr 17 17:25:07.979258 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:07.979217 2546 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-46.ec2.internal\": node \"ip-10-0-137-46.ec2.internal\" not found" Apr 17 17:25:07.990801 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:07.990783 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-46.ec2.internal\" not found" Apr 17 17:25:08.049076 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.049047 2546 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-46.ec2.internal"] Apr 17 17:25:08.049140 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.049121 2546 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:08.050061 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.050046 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:08.050136 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.050074 2546 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-137-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:08.050136 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.050090 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:08.051473 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.051460 2546 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:08.051637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.051624 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal" Apr 17 17:25:08.051687 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.051669 2546 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:08.052203 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.052189 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:08.052203 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.052196 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:08.052310 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.052214 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:08.052310 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.052229 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:08.052310 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.052216 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:08.052310 
ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.052262 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:08.053456 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.053443 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-46.ec2.internal" Apr 17 17:25:08.053510 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.053467 2546 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:08.054210 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.054194 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:08.054210 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.054213 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:08.054357 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.054224 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:08.083423 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:08.083401 2546 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-46.ec2.internal\" not found" node="ip-10-0-137-46.ec2.internal" Apr 17 17:25:08.086960 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:08.086942 2546 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-46.ec2.internal\" not found" node="ip-10-0-137-46.ec2.internal" Apr 17 17:25:08.090989 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:08.090974 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-46.ec2.internal\" not found" Apr 
17 17:25:08.114879 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.114856 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/01fb9b7f7df95b8fab2f65114ee88783-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal\" (UID: \"01fb9b7f7df95b8fab2f65114ee88783\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal" Apr 17 17:25:08.114964 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.114884 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/01fb9b7f7df95b8fab2f65114ee88783-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal\" (UID: \"01fb9b7f7df95b8fab2f65114ee88783\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal" Apr 17 17:25:08.114964 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.114904 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bec88f0e450f3cfd4f0ac310ffbc3b96-config\") pod \"kube-apiserver-proxy-ip-10-0-137-46.ec2.internal\" (UID: \"bec88f0e450f3cfd4f0ac310ffbc3b96\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-46.ec2.internal" Apr 17 17:25:08.191175 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:08.191149 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-46.ec2.internal\" not found" Apr 17 17:25:08.215632 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.215575 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/01fb9b7f7df95b8fab2f65114ee88783-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal\" (UID: \"01fb9b7f7df95b8fab2f65114ee88783\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal"
Apr 17 17:25:08.215632 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.215604 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bec88f0e450f3cfd4f0ac310ffbc3b96-config\") pod \"kube-apiserver-proxy-ip-10-0-137-46.ec2.internal\" (UID: \"bec88f0e450f3cfd4f0ac310ffbc3b96\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-46.ec2.internal"
Apr 17 17:25:08.215632 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.215625 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/01fb9b7f7df95b8fab2f65114ee88783-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal\" (UID: \"01fb9b7f7df95b8fab2f65114ee88783\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal"
Apr 17 17:25:08.215786 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.215672 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bec88f0e450f3cfd4f0ac310ffbc3b96-config\") pod \"kube-apiserver-proxy-ip-10-0-137-46.ec2.internal\" (UID: \"bec88f0e450f3cfd4f0ac310ffbc3b96\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-46.ec2.internal"
Apr 17 17:25:08.215786 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.215711 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/01fb9b7f7df95b8fab2f65114ee88783-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal\" (UID: \"01fb9b7f7df95b8fab2f65114ee88783\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal"
Apr 17 17:25:08.215786 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.215672 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/01fb9b7f7df95b8fab2f65114ee88783-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal\" (UID: \"01fb9b7f7df95b8fab2f65114ee88783\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal"
Apr 17 17:25:08.291885 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:08.291848 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-46.ec2.internal\" not found"
Apr 17 17:25:08.386398 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.386359 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal"
Apr 17 17:25:08.389966 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.389950 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-46.ec2.internal"
Apr 17 17:25:08.392472 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:08.392449 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-46.ec2.internal\" not found"
Apr 17 17:25:08.492668 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:08.492576 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-46.ec2.internal\" not found"
Apr 17 17:25:08.593082 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:08.593043 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-46.ec2.internal\" not found"
Apr 17 17:25:08.693622 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:08.693590 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-46.ec2.internal\" not found"
Apr 17 17:25:08.713934 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.713912 2546 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 17:25:08.714386 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.714035 2546 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 17:25:08.794701 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:08.794655 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-46.ec2.internal\" not found"
Apr 17 17:25:08.810971 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.810946 2546 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 17:25:08.823373 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.823351 2546 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:25:08.839454 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.839425 2546 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:20:07 +0000 UTC" deadline="2027-11-02 11:50:45.383736016 +0000 UTC"
Apr 17 17:25:08.839454 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.839449 2546 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13530h25m36.54428922s"
Apr 17 17:25:08.847406 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.847387 2546 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jqlzv"
Apr 17 17:25:08.855728 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.855659 2546 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jqlzv"
Apr 17 17:25:08.864514 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:08.864481 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbec88f0e450f3cfd4f0ac310ffbc3b96.slice/crio-a7d3fcf92619b1331afd4a83efd5761f79b3e2dc13752086db402aba9ebea4cc WatchSource:0}: Error finding container a7d3fcf92619b1331afd4a83efd5761f79b3e2dc13752086db402aba9ebea4cc: Status 404 returned error can't find the container with id a7d3fcf92619b1331afd4a83efd5761f79b3e2dc13752086db402aba9ebea4cc
Apr 17 17:25:08.864734 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:08.864714 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01fb9b7f7df95b8fab2f65114ee88783.slice/crio-043b8692062e003b82d023175aebd638cbdc5d7fff132dabff9e51dea0559d07 WatchSource:0}: Error finding container 043b8692062e003b82d023175aebd638cbdc5d7fff132dabff9e51dea0559d07: Status 404 returned error can't find the container with id 043b8692062e003b82d023175aebd638cbdc5d7fff132dabff9e51dea0559d07
Apr 17 17:25:08.870312 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.870285 2546 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:25:08.888487 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.888461 2546 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:25:08.895327 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:08.895307 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-46.ec2.internal\" not found"
Apr 17 17:25:08.951643 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.951587 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal" event={"ID":"01fb9b7f7df95b8fab2f65114ee88783","Type":"ContainerStarted","Data":"043b8692062e003b82d023175aebd638cbdc5d7fff132dabff9e51dea0559d07"}
Apr 17 17:25:08.952546 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:08.952526 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-46.ec2.internal" event={"ID":"bec88f0e450f3cfd4f0ac310ffbc3b96","Type":"ContainerStarted","Data":"a7d3fcf92619b1331afd4a83efd5761f79b3e2dc13752086db402aba9ebea4cc"}
Apr 17 17:25:08.995691 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:08.995653 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-46.ec2.internal\" not found"
Apr 17 17:25:09.002185 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.002165 2546 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:25:09.011992 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.011975 2546 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal"
Apr 17 17:25:09.024613 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.024559 2546 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 17:25:09.025443 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.025431 2546 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-46.ec2.internal"
Apr 17 17:25:09.034577 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.034560 2546 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 17:25:09.052961 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.052943 2546 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:25:09.789491 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.789455 2546 apiserver.go:52] "Watching apiserver"
Apr 17 17:25:09.795692 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.795654 2546 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 17:25:09.796061 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.796039 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-sf4kc","openshift-cluster-node-tuning-operator/tuned-wpsq6","openshift-dns/node-resolver-r6rvm","openshift-image-registry/node-ca-2phxj","openshift-multus/multus-qz758","openshift-network-diagnostics/network-check-target-zgwbf","openshift-network-operator/iptables-alerter-b4h7q","openshift-ovn-kubernetes/ovnkube-node-ft44m","kube-system/kube-apiserver-proxy-ip-10-0-137-46.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal","openshift-multus/multus-additional-cni-plugins-q92kj","openshift-multus/network-metrics-daemon-fbmql"]
Apr 17 17:25:09.798955 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.798934 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.799954 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.799935 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz"
Apr 17 17:25:09.801076 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.801053 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2phxj"
Apr 17 17:25:09.801894 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.801753 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 17:25:09.801894 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.801763 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 17:25:09.801894 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.801764 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 17:25:09.801894 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.801768 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-cjj9q\""
Apr 17 17:25:09.802194 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.802166 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 17:25:09.802300 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.802195 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-s5krx\""
Apr 17 17:25:09.802300 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.802174 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 17:25:09.802300 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.802173 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 17:25:09.802453 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.802173 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 17:25:09.803901 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.803880 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 17:25:09.804013 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.803983 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-sf4kc"
Apr 17 17:25:09.804238 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.804221 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 17:25:09.804976 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.804956 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 17:25:09.805351 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.805329 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7lcdx\""
Apr 17 17:25:09.806024 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.805669 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-r6rvm"
Apr 17 17:25:09.806405 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.806385 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 17:25:09.806962 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.806944 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 17:25:09.807159 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.807141 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-nxwz8\""
Apr 17 17:25:09.807749 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.807728 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf"
Apr 17 17:25:09.807838 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:09.807812 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392"
Apr 17 17:25:09.808260 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.808169 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-b4h7q"
Apr 17 17:25:09.808485 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.808465 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 17:25:09.808716 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.808695 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 17:25:09.808872 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.808849 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-nkq72\""
Apr 17 17:25:09.810622 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.810605 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.811400 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.811380 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:25:09.811488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.811409 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 17:25:09.811552 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.811530 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8fntx\""
Apr 17 17:25:09.811668 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.811607 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 17:25:09.811965 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.811949 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.813160 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.813137 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 17:25:09.813246 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.813216 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 17:25:09.813304 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.813254 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 17:25:09.813304 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.813267 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 17:25:09.813415 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.813336 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 17:25:09.813415 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.813391 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 17:25:09.813737 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.813646 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q92kj"
Apr 17 17:25:09.813818 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.813796 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-65brh\""
Apr 17 17:25:09.813920 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.813886 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:25:09.814142 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.814124 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 17:25:09.814233 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.814215 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vnn4x\""
Apr 17 17:25:09.815241 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.815220 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql"
Apr 17 17:25:09.815339 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:09.815280 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297"
Apr 17 17:25:09.815974 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.815956 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mvbh9\""
Apr 17 17:25:09.816132 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.816071 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 17:25:09.816436 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.816371 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 17:25:09.823326 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823306 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-sysconfig\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.823407 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823341 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-run\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.823407 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823367 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e9676b31-a86d-4645-96b9-1bfa16b53a94-agent-certs\") pod \"konnectivity-agent-sf4kc\" (UID: \"e9676b31-a86d-4645-96b9-1bfa16b53a94\") " pod="kube-system/konnectivity-agent-sf4kc"
Apr 17 17:25:09.823407 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823390 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e869ba13-1af3-46e4-bbaa-eef8b748f612-ovnkube-config\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.823556 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823414 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e869ba13-1af3-46e4-bbaa-eef8b748f612-ovnkube-script-lib\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.823556 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823438 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbxmr\" (UniqueName: \"kubernetes.io/projected/e869ba13-1af3-46e4-bbaa-eef8b748f612-kube-api-access-wbxmr\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.823556 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823461 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-multus-cni-dir\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.823556 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823483 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-cnibin\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.823556 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823514 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-run-netns\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.823556 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823545 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-var-lib-cni-multus\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.823797 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823575 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-multus-conf-dir\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.823797 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823617 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-sys-fs\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz"
Apr 17 17:25:09.823797 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823646 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2lsz\" (UniqueName: \"kubernetes.io/projected/c5ad42a7-476b-4a76-800f-f9c45c9f695f-kube-api-access-g2lsz\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz"
Apr 17 17:25:09.823797 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823668 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8b1e9f99-746c-4f28-80b9-ea9eb814cd98-serviceca\") pod \"node-ca-2phxj\" (UID: \"8b1e9f99-746c-4f28-80b9-ea9eb814cd98\") " pod="openshift-image-registry/node-ca-2phxj"
Apr 17 17:25:09.823797 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823719 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/19f42a69-1efb-4887-94b9-58c87acfe319-iptables-alerter-script\") pod \"iptables-alerter-b4h7q\" (UID: \"19f42a69-1efb-4887-94b9-58c87acfe319\") " pod="openshift-network-operator/iptables-alerter-b4h7q"
Apr 17 17:25:09.823797 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823743 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-system-cni-dir\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.823797 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823765 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-os-release\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.823797 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823781 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-device-dir\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz"
Apr 17 17:25:09.824120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823820 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-run-systemd\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.824120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823851 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-kubernetes\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.824120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823876 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-sysctl-d\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.824120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823898 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-slash\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.824120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823942 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-systemd-units\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.824120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823962 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-var-lib-openvswitch\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.824120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.823978 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-cni-bin\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.824120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824009 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e869ba13-1af3-46e4-bbaa-eef8b748f612-ovn-node-metrics-cert\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.824120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824033 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a2e022ed-9ba3-454c-9f27-4b300f2393d6-hosts-file\") pod \"node-resolver-r6rvm\" (UID: \"a2e022ed-9ba3-454c-9f27-4b300f2393d6\") " pod="openshift-dns/node-resolver-r6rvm"
Apr 17 17:25:09.824120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824052 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-modprobe-d\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.824120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824066 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e9676b31-a86d-4645-96b9-1bfa16b53a94-konnectivity-ca\") pod \"konnectivity-agent-sf4kc\" (UID: \"e9676b31-a86d-4645-96b9-1bfa16b53a94\") " pod="kube-system/konnectivity-agent-sf4kc"
Apr 17 17:25:09.824120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824082 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-hostroot\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.824120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824117 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wbkm\" (UniqueName: \"kubernetes.io/projected/8b1e9f99-746c-4f28-80b9-ea9eb814cd98-kube-api-access-5wbkm\") pod \"node-ca-2phxj\" (UID: \"8b1e9f99-746c-4f28-80b9-ea9eb814cd98\") " pod="openshift-image-registry/node-ca-2phxj"
Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824145 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-tuned\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824165 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6msgh\" (UniqueName: \"kubernetes.io/projected/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-kube-api-access-6msgh\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824187 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e869ba13-1af3-46e4-bbaa-eef8b748f612-env-overrides\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824208 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-systemd\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824222 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-node-log\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824245 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-cni-netd\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824268 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-run-k8s-cni-cncf-io\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824295 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-run-ovn-kubernetes\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824319 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-multus-socket-dir-parent\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824342 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-etc-kubernetes\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824364 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-var-lib-kubelet\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824386 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz"
Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824434 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-host\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824469 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2e022ed-9ba3-454c-9f27-4b300f2393d6-tmp-dir\") pod \"node-resolver-r6rvm\" (UID: \"a2e022ed-9ba3-454c-9f27-4b300f2393d6\") " pod="openshift-dns/node-resolver-r6rvm"
Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824497 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7l9m\" (UniqueName: \"kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m\") pod \"network-check-target-zgwbf\" (UID:
\"fa8cdfa0-8080-411d-bd6e-51b977229392\") " pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824522 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-run-ovn\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.824705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824545 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-log-socket\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.825372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824567 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-cni-binary-copy\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.825372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824592 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-var-lib-cni-bin\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.825372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824615 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-registration-dir\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz" Apr 17 17:25:09.825372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824640 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.825372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824664 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b1e9f99-746c-4f28-80b9-ea9eb814cd98-host\") pod \"node-ca-2phxj\" (UID: \"8b1e9f99-746c-4f28-80b9-ea9eb814cd98\") " pod="openshift-image-registry/node-ca-2phxj" Apr 17 17:25:09.825372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824711 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-sys\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" Apr 17 17:25:09.825372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824742 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-var-lib-kubelet\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" Apr 17 17:25:09.825372 ip-10-0-137-46 kubenswrapper[2546]: I0417 
17:25:09.824766 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm8j5\" (UniqueName: \"kubernetes.io/projected/19f42a69-1efb-4887-94b9-58c87acfe319-kube-api-access-lm8j5\") pod \"iptables-alerter-b4h7q\" (UID: \"19f42a69-1efb-4887-94b9-58c87acfe319\") " pod="openshift-network-operator/iptables-alerter-b4h7q" Apr 17 17:25:09.825372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824792 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-kubelet\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.825372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824810 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-run-netns\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.825372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824840 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-run-multus-certs\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.825372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824872 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4gbm\" (UniqueName: \"kubernetes.io/projected/a2e022ed-9ba3-454c-9f27-4b300f2393d6-kube-api-access-p4gbm\") pod \"node-resolver-r6rvm\" (UID: \"a2e022ed-9ba3-454c-9f27-4b300f2393d6\") " 
pod="openshift-dns/node-resolver-r6rvm" Apr 17 17:25:09.825372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824889 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-sysctl-conf\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" Apr 17 17:25:09.825372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824912 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-tmp\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" Apr 17 17:25:09.825372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824933 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkcwq\" (UniqueName: \"kubernetes.io/projected/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-kube-api-access-mkcwq\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" Apr 17 17:25:09.825372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824957 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-lib-modules\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" Apr 17 17:25:09.825914 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824979 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/19f42a69-1efb-4887-94b9-58c87acfe319-host-slash\") 
pod \"iptables-alerter-b4h7q\" (UID: \"19f42a69-1efb-4887-94b9-58c87acfe319\") " pod="openshift-network-operator/iptables-alerter-b4h7q" Apr 17 17:25:09.825914 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.824997 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-run-openvswitch\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.825914 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.825012 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-etc-openvswitch\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.825914 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.825027 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-multus-daemon-config\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.825914 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.825090 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-socket-dir\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz" Apr 17 17:25:09.825914 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.825130 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-etc-selinux\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz" Apr 17 17:25:09.856569 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.856544 2546 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:20:08 +0000 UTC" deadline="2027-11-15 04:13:20.719822078 +0000 UTC" Apr 17 17:25:09.856569 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.856569 2546 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13834h48m10.863256548s" Apr 17 17:25:09.912145 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.912118 2546 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 17:25:09.926109 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926075 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e869ba13-1af3-46e4-bbaa-eef8b748f612-env-overrides\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.926289 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926121 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-systemd\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" Apr 17 17:25:09.926289 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926148 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-node-log\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.926289 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926176 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-cni-netd\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.926289 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926193 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-run-k8s-cni-cncf-io\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.926289 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926224 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/17cc3e64-292f-4b71-9d6f-6deb75cffce6-cnibin\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:09.926289 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926231 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-systemd\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" Apr 17 17:25:09.926289 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926250 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-wk8vt\" (UniqueName: \"kubernetes.io/projected/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-kube-api-access-wk8vt\") pod \"network-metrics-daemon-fbmql\" (UID: \"173598bb-6dcc-46e9-a78f-f3d5c1fd4297\") " pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926293 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-run-ovn-kubernetes\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926292 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-cni-netd\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926304 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-run-k8s-cni-cncf-io\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926337 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-run-ovn-kubernetes\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926358 2546 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-node-log\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926366 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-multus-socket-dir-parent\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926405 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-multus-socket-dir-parent\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926407 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-etc-kubernetes\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926436 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-var-lib-kubelet\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926447 2546 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-etc-kubernetes\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926463 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926477 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-var-lib-kubelet\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926487 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-host\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926507 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926513 2546 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2e022ed-9ba3-454c-9f27-4b300f2393d6-tmp-dir\") pod \"node-resolver-r6rvm\" (UID: \"a2e022ed-9ba3-454c-9f27-4b300f2393d6\") " pod="openshift-dns/node-resolver-r6rvm" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926538 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-host\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926541 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7l9m\" (UniqueName: \"kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m\") pod \"network-check-target-zgwbf\" (UID: \"fa8cdfa0-8080-411d-bd6e-51b977229392\") " pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:09.926627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926567 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-run-ovn\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926590 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-log-socket\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926615 2546 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-cni-binary-copy\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926638 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-var-lib-cni-bin\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926661 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-registration-dir\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz" Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926707 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926733 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b1e9f99-746c-4f28-80b9-ea9eb814cd98-host\") pod \"node-ca-2phxj\" (UID: \"8b1e9f99-746c-4f28-80b9-ea9eb814cd98\") " pod="openshift-image-registry/node-ca-2phxj" Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 
17:25:09.926757 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-sys\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926774 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-log-socket\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926794 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-var-lib-cni-bin\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926812 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b1e9f99-746c-4f28-80b9-ea9eb814cd98-host\") pod \"node-ca-2phxj\" (UID: \"8b1e9f99-746c-4f28-80b9-ea9eb814cd98\") " pod="openshift-image-registry/node-ca-2phxj" Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926818 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2e022ed-9ba3-454c-9f27-4b300f2393d6-tmp-dir\") pod \"node-resolver-r6rvm\" (UID: \"a2e022ed-9ba3-454c-9f27-4b300f2393d6\") " pod="openshift-dns/node-resolver-r6rvm" Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926820 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926840 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-run-ovn\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926878 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-registration-dir\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz"
Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926882 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-var-lib-kubelet\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926885 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-sys\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926915 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm8j5\" (UniqueName: \"kubernetes.io/projected/19f42a69-1efb-4887-94b9-58c87acfe319-kube-api-access-lm8j5\") pod \"iptables-alerter-b4h7q\" (UID: \"19f42a69-1efb-4887-94b9-58c87acfe319\") " pod="openshift-network-operator/iptables-alerter-b4h7q"
Apr 17 17:25:09.927488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926927 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-var-lib-kubelet\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926945 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-kubelet\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926969 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-run-netns\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.926995 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-run-multus-certs\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927019 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4gbm\" (UniqueName: \"kubernetes.io/projected/a2e022ed-9ba3-454c-9f27-4b300f2393d6-kube-api-access-p4gbm\") pod \"node-resolver-r6rvm\" (UID: \"a2e022ed-9ba3-454c-9f27-4b300f2393d6\") " pod="openshift-dns/node-resolver-r6rvm"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927044 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-sysctl-conf\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927047 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-run-netns\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927049 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-run-multus-certs\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927069 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-tmp\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927097 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkcwq\" (UniqueName: \"kubernetes.io/projected/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-kube-api-access-mkcwq\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927107 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-kubelet\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927123 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-lib-modules\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927146 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/19f42a69-1efb-4887-94b9-58c87acfe319-host-slash\") pod \"iptables-alerter-b4h7q\" (UID: \"19f42a69-1efb-4887-94b9-58c87acfe319\") " pod="openshift-network-operator/iptables-alerter-b4h7q"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927172 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-run-openvswitch\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927221 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-run-openvswitch\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927267 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/19f42a69-1efb-4887-94b9-58c87acfe319-host-slash\") pod \"iptables-alerter-b4h7q\" (UID: \"19f42a69-1efb-4887-94b9-58c87acfe319\") " pod="openshift-network-operator/iptables-alerter-b4h7q"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927299 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-lib-modules\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927335 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs\") pod \"network-metrics-daemon-fbmql\" (UID: \"173598bb-6dcc-46e9-a78f-f3d5c1fd4297\") " pod="openshift-multus/network-metrics-daemon-fbmql"
Apr 17 17:25:09.928340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927366 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-etc-openvswitch\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927378 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-sysctl-conf\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927391 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-multus-daemon-config\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927390 2546 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927423 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-cni-binary-copy\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927433 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-socket-dir\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927463 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-etc-selinux\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927503 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/17cc3e64-292f-4b71-9d6f-6deb75cffce6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927511 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-etc-openvswitch\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927531 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/17cc3e64-292f-4b71-9d6f-6deb75cffce6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927558 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-socket-dir\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927560 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-sysconfig\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927601 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-sysconfig\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927607 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-etc-selinux\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927603 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-run\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927640 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-run\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927648 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e9676b31-a86d-4645-96b9-1bfa16b53a94-agent-certs\") pod \"konnectivity-agent-sf4kc\" (UID: \"e9676b31-a86d-4645-96b9-1bfa16b53a94\") " pod="kube-system/konnectivity-agent-sf4kc"
Apr 17 17:25:09.929257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927688 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e869ba13-1af3-46e4-bbaa-eef8b748f612-ovnkube-config\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927785 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e869ba13-1af3-46e4-bbaa-eef8b748f612-ovnkube-script-lib\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927820 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbxmr\" (UniqueName: \"kubernetes.io/projected/e869ba13-1af3-46e4-bbaa-eef8b748f612-kube-api-access-wbxmr\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927846 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-multus-cni-dir\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927871 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-cnibin\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927897 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-run-netns\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927920 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-var-lib-cni-multus\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927926 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e869ba13-1af3-46e4-bbaa-eef8b748f612-env-overrides\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927945 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-multus-conf-dir\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928004 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-sys-fs\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928032 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2lsz\" (UniqueName: \"kubernetes.io/projected/c5ad42a7-476b-4a76-800f-f9c45c9f695f-kube-api-access-g2lsz\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928060 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8b1e9f99-746c-4f28-80b9-ea9eb814cd98-serviceca\") pod \"node-ca-2phxj\" (UID: \"8b1e9f99-746c-4f28-80b9-ea9eb814cd98\") " pod="openshift-image-registry/node-ca-2phxj"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928084 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/19f42a69-1efb-4887-94b9-58c87acfe319-iptables-alerter-script\") pod \"iptables-alerter-b4h7q\" (UID: \"19f42a69-1efb-4887-94b9-58c87acfe319\") " pod="openshift-network-operator/iptables-alerter-b4h7q"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928088 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-run-netns\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928112 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-system-cni-dir\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928143 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/17cc3e64-292f-4b71-9d6f-6deb75cffce6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928150 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-multus-cni-dir\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.930075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.927847 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-multus-daemon-config\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928170 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-os-release\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928172 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e869ba13-1af3-46e4-bbaa-eef8b748f612-ovnkube-config\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928197 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-device-dir\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928205 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-cnibin\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928224 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/17cc3e64-292f-4b71-9d6f-6deb75cffce6-os-release\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928251 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-host-var-lib-cni-multus\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928253 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-run-systemd\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928284 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-kubernetes\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928298 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-multus-conf-dir\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928309 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-sysctl-d\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928335 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-slash\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928354 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-sys-fs\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928387 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-systemd-units\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928417 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-var-lib-openvswitch\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928433 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-systemd-units\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928445 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-cni-bin\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928485 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-var-lib-openvswitch\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.930889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928485 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e869ba13-1af3-46e4-bbaa-eef8b748f612-ovn-node-metrics-cert\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928513 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8b1e9f99-746c-4f28-80b9-ea9eb814cd98-serviceca\") pod \"node-ca-2phxj\" (UID: \"8b1e9f99-746c-4f28-80b9-ea9eb814cd98\") " pod="openshift-image-registry/node-ca-2phxj"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928529 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a2e022ed-9ba3-454c-9f27-4b300f2393d6-hosts-file\") pod \"node-resolver-r6rvm\" (UID: \"a2e022ed-9ba3-454c-9f27-4b300f2393d6\") " pod="openshift-dns/node-resolver-r6rvm"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928558 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-modprobe-d\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928584 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e9676b31-a86d-4645-96b9-1bfa16b53a94-konnectivity-ca\") pod \"konnectivity-agent-sf4kc\" (UID: \"e9676b31-a86d-4645-96b9-1bfa16b53a94\") " pod="kube-system/konnectivity-agent-sf4kc"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928609 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-hostroot\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928610 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-run-systemd\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928637 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5wbkm\" (UniqueName: \"kubernetes.io/projected/8b1e9f99-746c-4f28-80b9-ea9eb814cd98-kube-api-access-5wbkm\") pod \"node-ca-2phxj\" (UID: \"8b1e9f99-746c-4f28-80b9-ea9eb814cd98\") " pod="openshift-image-registry/node-ca-2phxj"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928665 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-tuned\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928670 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-kubernetes\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928708 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6msgh\" (UniqueName: \"kubernetes.io/projected/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-kube-api-access-6msgh\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928762 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17cc3e64-292f-4b71-9d6f-6deb75cffce6-system-cni-dir\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928782 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a2e022ed-9ba3-454c-9f27-4b300f2393d6-hosts-file\") pod \"node-resolver-r6rvm\" (UID: \"a2e022ed-9ba3-454c-9f27-4b300f2393d6\") " pod="openshift-dns/node-resolver-r6rvm"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928791 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/17cc3e64-292f-4b71-9d6f-6deb75cffce6-cni-binary-copy\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928818 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8s6h\" (UniqueName: \"kubernetes.io/projected/17cc3e64-292f-4b71-9d6f-6deb75cffce6-kube-api-access-t8s6h\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.928831 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c5ad42a7-476b-4a76-800f-f9c45c9f695f-device-dir\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.929009 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/19f42a69-1efb-4887-94b9-58c87acfe319-iptables-alerter-script\") pod \"iptables-alerter-b4h7q\" (UID: \"19f42a69-1efb-4887-94b9-58c87acfe319\") " pod="openshift-network-operator/iptables-alerter-b4h7q"
Apr 17 17:25:09.931637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.929075 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-system-cni-dir\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.932319 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.929077 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-modprobe-d\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr 17 17:25:09.932319 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.929123 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-slash\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.932319 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.929148 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-os-release\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.932319 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.929182 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-hostroot\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758"
Apr 17 17:25:09.932319 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.929183 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e869ba13-1af3-46e4-bbaa-eef8b748f612-host-cni-bin\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m"
Apr 17 17:25:09.932319 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.929184 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-sysctl-d\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6"
Apr
17 17:25:09.932319 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.929370 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e869ba13-1af3-46e4-bbaa-eef8b748f612-ovnkube-script-lib\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.932319 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.929639 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e9676b31-a86d-4645-96b9-1bfa16b53a94-konnectivity-ca\") pod \"konnectivity-agent-sf4kc\" (UID: \"e9676b31-a86d-4645-96b9-1bfa16b53a94\") " pod="kube-system/konnectivity-agent-sf4kc" Apr 17 17:25:09.932319 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.931849 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-tmp\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" Apr 17 17:25:09.932319 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.931922 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e869ba13-1af3-46e4-bbaa-eef8b748f612-ovn-node-metrics-cert\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:09.932319 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.932051 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-etc-tuned\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" Apr 17 17:25:09.932319 ip-10-0-137-46 
kubenswrapper[2546]: I0417 17:25:09.932221 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e9676b31-a86d-4645-96b9-1bfa16b53a94-agent-certs\") pod \"konnectivity-agent-sf4kc\" (UID: \"e9676b31-a86d-4645-96b9-1bfa16b53a94\") " pod="kube-system/konnectivity-agent-sf4kc" Apr 17 17:25:09.935748 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:09.935715 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:09.935748 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:09.935741 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:09.935748 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:09.935754 2546 projected.go:194] Error preparing data for projected volume kube-api-access-n7l9m for pod openshift-network-diagnostics/network-check-target-zgwbf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:09.935985 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:09.935833 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m podName:fa8cdfa0-8080-411d-bd6e-51b977229392 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:10.435813813 +0000 UTC m=+3.058094468 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n7l9m" (UniqueName: "kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m") pod "network-check-target-zgwbf" (UID: "fa8cdfa0-8080-411d-bd6e-51b977229392") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:09.938159 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.938139 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4gbm\" (UniqueName: \"kubernetes.io/projected/a2e022ed-9ba3-454c-9f27-4b300f2393d6-kube-api-access-p4gbm\") pod \"node-resolver-r6rvm\" (UID: \"a2e022ed-9ba3-454c-9f27-4b300f2393d6\") " pod="openshift-dns/node-resolver-r6rvm" Apr 17 17:25:09.938276 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.938221 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkcwq\" (UniqueName: \"kubernetes.io/projected/53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f-kube-api-access-mkcwq\") pod \"tuned-wpsq6\" (UID: \"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f\") " pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" Apr 17 17:25:09.938559 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.938536 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm8j5\" (UniqueName: \"kubernetes.io/projected/19f42a69-1efb-4887-94b9-58c87acfe319-kube-api-access-lm8j5\") pod \"iptables-alerter-b4h7q\" (UID: \"19f42a69-1efb-4887-94b9-58c87acfe319\") " pod="openshift-network-operator/iptables-alerter-b4h7q" Apr 17 17:25:09.943135 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.943101 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2lsz\" (UniqueName: \"kubernetes.io/projected/c5ad42a7-476b-4a76-800f-f9c45c9f695f-kube-api-access-g2lsz\") pod \"aws-ebs-csi-driver-node-z7qzz\" (UID: \"c5ad42a7-476b-4a76-800f-f9c45c9f695f\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz" Apr 17 17:25:09.945415 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.944188 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6msgh\" (UniqueName: \"kubernetes.io/projected/7eb9cbde-0842-4e38-ab1b-0c93d220e92a-kube-api-access-6msgh\") pod \"multus-qz758\" (UID: \"7eb9cbde-0842-4e38-ab1b-0c93d220e92a\") " pod="openshift-multus/multus-qz758" Apr 17 17:25:09.945871 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.945851 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wbkm\" (UniqueName: \"kubernetes.io/projected/8b1e9f99-746c-4f28-80b9-ea9eb814cd98-kube-api-access-5wbkm\") pod \"node-ca-2phxj\" (UID: \"8b1e9f99-746c-4f28-80b9-ea9eb814cd98\") " pod="openshift-image-registry/node-ca-2phxj" Apr 17 17:25:09.950579 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:09.950558 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbxmr\" (UniqueName: \"kubernetes.io/projected/e869ba13-1af3-46e4-bbaa-eef8b748f612-kube-api-access-wbxmr\") pod \"ovnkube-node-ft44m\" (UID: \"e869ba13-1af3-46e4-bbaa-eef8b748f612\") " pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:10.029911 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.029870 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17cc3e64-292f-4b71-9d6f-6deb75cffce6-system-cni-dir\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.030069 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.029920 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/17cc3e64-292f-4b71-9d6f-6deb75cffce6-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.030069 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.029998 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17cc3e64-292f-4b71-9d6f-6deb75cffce6-system-cni-dir\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.030069 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.030046 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8s6h\" (UniqueName: \"kubernetes.io/projected/17cc3e64-292f-4b71-9d6f-6deb75cffce6-kube-api-access-t8s6h\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.030332 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.030083 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/17cc3e64-292f-4b71-9d6f-6deb75cffce6-cnibin\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.030332 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.030106 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wk8vt\" (UniqueName: \"kubernetes.io/projected/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-kube-api-access-wk8vt\") pod \"network-metrics-daemon-fbmql\" (UID: \"173598bb-6dcc-46e9-a78f-f3d5c1fd4297\") " pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:10.030332 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.030164 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/17cc3e64-292f-4b71-9d6f-6deb75cffce6-cnibin\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.030332 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.030173 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs\") pod \"network-metrics-daemon-fbmql\" (UID: \"173598bb-6dcc-46e9-a78f-f3d5c1fd4297\") " pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:10.030332 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.030228 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/17cc3e64-292f-4b71-9d6f-6deb75cffce6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.030332 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.030257 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/17cc3e64-292f-4b71-9d6f-6deb75cffce6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.030332 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:10.030272 2546 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:10.030332 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.030304 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/17cc3e64-292f-4b71-9d6f-6deb75cffce6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.030332 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.030335 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/17cc3e64-292f-4b71-9d6f-6deb75cffce6-os-release\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.030752 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:10.030363 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs podName:173598bb-6dcc-46e9-a78f-f3d5c1fd4297 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:10.530344655 +0000 UTC m=+3.152625327 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs") pod "network-metrics-daemon-fbmql" (UID: "173598bb-6dcc-46e9-a78f-f3d5c1fd4297") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:10.030752 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.030444 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/17cc3e64-292f-4b71-9d6f-6deb75cffce6-os-release\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.030752 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.030593 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/17cc3e64-292f-4b71-9d6f-6deb75cffce6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.030752 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.030608 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/17cc3e64-292f-4b71-9d6f-6deb75cffce6-cni-binary-copy\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.030961 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.030777 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/17cc3e64-292f-4b71-9d6f-6deb75cffce6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " 
pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.030961 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.030929 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/17cc3e64-292f-4b71-9d6f-6deb75cffce6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.039258 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.039232 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk8vt\" (UniqueName: \"kubernetes.io/projected/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-kube-api-access-wk8vt\") pod \"network-metrics-daemon-fbmql\" (UID: \"173598bb-6dcc-46e9-a78f-f3d5c1fd4297\") " pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:10.039387 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.039270 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8s6h\" (UniqueName: \"kubernetes.io/projected/17cc3e64-292f-4b71-9d6f-6deb75cffce6-kube-api-access-t8s6h\") pod \"multus-additional-cni-plugins-q92kj\" (UID: \"17cc3e64-292f-4b71-9d6f-6deb75cffce6\") " pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.111297 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.111202 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qz758" Apr 17 17:25:10.119087 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.119063 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz" Apr 17 17:25:10.126637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.126619 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2phxj" Apr 17 17:25:10.131246 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.131226 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-sf4kc" Apr 17 17:25:10.138764 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.138745 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-r6rvm" Apr 17 17:25:10.146294 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.146267 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-b4h7q" Apr 17 17:25:10.152899 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.152879 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:10.159452 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.159433 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" Apr 17 17:25:10.164992 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.164965 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q92kj" Apr 17 17:25:10.276235 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.276207 2546 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:25:10.492972 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:10.492880 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9676b31_a86d_4645_96b9_1bfa16b53a94.slice/crio-da8ae6565fbd35ed2a60378785561bdc320e8166276f81de04282978b417ac7d WatchSource:0}: Error finding container da8ae6565fbd35ed2a60378785561bdc320e8166276f81de04282978b417ac7d: Status 404 returned error can't find the container with id da8ae6565fbd35ed2a60378785561bdc320e8166276f81de04282978b417ac7d Apr 17 17:25:10.494947 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:10.494911 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5ad42a7_476b_4a76_800f_f9c45c9f695f.slice/crio-a5ea84d035a66c66e1abd5122ac42998a82e2cbd662250534bbd34ff1ae07f42 WatchSource:0}: Error finding container a5ea84d035a66c66e1abd5122ac42998a82e2cbd662250534bbd34ff1ae07f42: Status 404 returned error can't find the container with id a5ea84d035a66c66e1abd5122ac42998a82e2cbd662250534bbd34ff1ae07f42 Apr 17 17:25:10.495592 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:10.495502 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eb9cbde_0842_4e38_ab1b_0c93d220e92a.slice/crio-0a02dc07536da3dae9a4e09c275bfb0f31af4f3112a2c745e26ced7237591b73 WatchSource:0}: Error finding container 0a02dc07536da3dae9a4e09c275bfb0f31af4f3112a2c745e26ced7237591b73: Status 404 returned error can't find the container with id 0a02dc07536da3dae9a4e09c275bfb0f31af4f3112a2c745e26ced7237591b73 Apr 17 17:25:10.497991 ip-10-0-137-46 
kubenswrapper[2546]: W0417 17:25:10.497975 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b1e9f99_746c_4f28_80b9_ea9eb814cd98.slice/crio-72b81f4c467de1964b39ad1c40032a4abce2167be3f1459767f8d356862c1a7d WatchSource:0}: Error finding container 72b81f4c467de1964b39ad1c40032a4abce2167be3f1459767f8d356862c1a7d: Status 404 returned error can't find the container with id 72b81f4c467de1964b39ad1c40032a4abce2167be3f1459767f8d356862c1a7d Apr 17 17:25:10.498986 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:10.498904 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2e022ed_9ba3_454c_9f27_4b300f2393d6.slice/crio-2b778250c6a5e8f05c8a267e7e364fce20969e3d30d57b9884d7c80bf426f3d1 WatchSource:0}: Error finding container 2b778250c6a5e8f05c8a267e7e364fce20969e3d30d57b9884d7c80bf426f3d1: Status 404 returned error can't find the container with id 2b778250c6a5e8f05c8a267e7e364fce20969e3d30d57b9884d7c80bf426f3d1 Apr 17 17:25:10.500546 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:10.500326 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17cc3e64_292f_4b71_9d6f_6deb75cffce6.slice/crio-6dfa30ce9d1697a5e489db797a1c8ae7a8b957fa225c8f882fb224770e96949b WatchSource:0}: Error finding container 6dfa30ce9d1697a5e489db797a1c8ae7a8b957fa225c8f882fb224770e96949b: Status 404 returned error can't find the container with id 6dfa30ce9d1697a5e489db797a1c8ae7a8b957fa225c8f882fb224770e96949b Apr 17 17:25:10.500967 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:10.500860 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53031ab0_20d5_45d9_8cf7_e2d6fc9ec15f.slice/crio-a182de2a1116d3bc9b0c9f862d6a7ecada054f5f4b0c7c7bff085661ea989750 WatchSource:0}: Error finding container 
a182de2a1116d3bc9b0c9f862d6a7ecada054f5f4b0c7c7bff085661ea989750: Status 404 returned error can't find the container with id a182de2a1116d3bc9b0c9f862d6a7ecada054f5f4b0c7c7bff085661ea989750 Apr 17 17:25:10.502630 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:10.502607 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19f42a69_1efb_4887_94b9_58c87acfe319.slice/crio-cb778af251f52194e8ed894640b0fcefc8e6898e93da060b3468b1cc60128afd WatchSource:0}: Error finding container cb778af251f52194e8ed894640b0fcefc8e6898e93da060b3468b1cc60128afd: Status 404 returned error can't find the container with id cb778af251f52194e8ed894640b0fcefc8e6898e93da060b3468b1cc60128afd Apr 17 17:25:10.503653 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:25:10.503632 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode869ba13_1af3_46e4_bbaa_eef8b748f612.slice/crio-e8c06dee3472c38cee2e5f6cbb868a04506e6d5897989e9d45ab3ec45f0ffe95 WatchSource:0}: Error finding container e8c06dee3472c38cee2e5f6cbb868a04506e6d5897989e9d45ab3ec45f0ffe95: Status 404 returned error can't find the container with id e8c06dee3472c38cee2e5f6cbb868a04506e6d5897989e9d45ab3ec45f0ffe95 Apr 17 17:25:10.533724 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.533595 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7l9m\" (UniqueName: \"kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m\") pod \"network-check-target-zgwbf\" (UID: \"fa8cdfa0-8080-411d-bd6e-51b977229392\") " pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:10.533831 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.533763 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs\") pod \"network-metrics-daemon-fbmql\" (UID: \"173598bb-6dcc-46e9-a78f-f3d5c1fd4297\") " pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:10.533831 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:10.533785 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:10.533831 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:10.533807 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:10.533831 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:10.533820 2546 projected.go:194] Error preparing data for projected volume kube-api-access-n7l9m for pod openshift-network-diagnostics/network-check-target-zgwbf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:10.533995 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:10.533867 2546 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:10.533995 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:10.533883 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m podName:fa8cdfa0-8080-411d-bd6e-51b977229392 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:11.533863682 +0000 UTC m=+4.156144339 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n7l9m" (UniqueName: "kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m") pod "network-check-target-zgwbf" (UID: "fa8cdfa0-8080-411d-bd6e-51b977229392") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:10.533995 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:10.533926 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs podName:173598bb-6dcc-46e9-a78f-f3d5c1fd4297 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:11.533917778 +0000 UTC m=+4.156198431 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs") pod "network-metrics-daemon-fbmql" (UID: "173598bb-6dcc-46e9-a78f-f3d5c1fd4297") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:10.857925 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.857561 2546 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:20:08 +0000 UTC" deadline="2027-10-24 02:34:50.287121984 +0000 UTC" Apr 17 17:25:10.857925 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.857599 2546 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13305h9m39.429527075s" Apr 17 17:25:10.948989 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.948952 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:10.949162 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:10.949106 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297" Apr 17 17:25:10.960210 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.960175 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-46.ec2.internal" event={"ID":"bec88f0e450f3cfd4f0ac310ffbc3b96","Type":"ContainerStarted","Data":"d242d75abb366bef7ac2e9aa6be5c8696fd988002b3c3b8648225aa8c9f0e4de"} Apr 17 17:25:10.974241 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.974169 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-b4h7q" event={"ID":"19f42a69-1efb-4887-94b9-58c87acfe319","Type":"ContainerStarted","Data":"cb778af251f52194e8ed894640b0fcefc8e6898e93da060b3468b1cc60128afd"} Apr 17 17:25:10.975736 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.975656 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" event={"ID":"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f","Type":"ContainerStarted","Data":"a182de2a1116d3bc9b0c9f862d6a7ecada054f5f4b0c7c7bff085661ea989750"} Apr 17 17:25:10.977750 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.977051 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-46.ec2.internal" podStartSLOduration=1.977035217 podStartE2EDuration="1.977035217s" podCreationTimestamp="2026-04-17 17:25:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:10.976821345 +0000 UTC m=+3.599102020" watchObservedRunningTime="2026-04-17 17:25:10.977035217 +0000 UTC m=+3.599315893" Apr 17 17:25:10.977917 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.977896 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r6rvm" event={"ID":"a2e022ed-9ba3-454c-9f27-4b300f2393d6","Type":"ContainerStarted","Data":"2b778250c6a5e8f05c8a267e7e364fce20969e3d30d57b9884d7c80bf426f3d1"} Apr 17 17:25:10.983523 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.983494 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2phxj" event={"ID":"8b1e9f99-746c-4f28-80b9-ea9eb814cd98","Type":"ContainerStarted","Data":"72b81f4c467de1964b39ad1c40032a4abce2167be3f1459767f8d356862c1a7d"} Apr 17 17:25:10.985850 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.985814 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qz758" event={"ID":"7eb9cbde-0842-4e38-ab1b-0c93d220e92a","Type":"ContainerStarted","Data":"0a02dc07536da3dae9a4e09c275bfb0f31af4f3112a2c745e26ced7237591b73"} Apr 17 17:25:10.991382 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.990177 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz" event={"ID":"c5ad42a7-476b-4a76-800f-f9c45c9f695f","Type":"ContainerStarted","Data":"a5ea84d035a66c66e1abd5122ac42998a82e2cbd662250534bbd34ff1ae07f42"} Apr 17 17:25:10.998329 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.995511 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" event={"ID":"e869ba13-1af3-46e4-bbaa-eef8b748f612","Type":"ContainerStarted","Data":"e8c06dee3472c38cee2e5f6cbb868a04506e6d5897989e9d45ab3ec45f0ffe95"} Apr 17 17:25:10.999997 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:10.999973 2546 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q92kj" event={"ID":"17cc3e64-292f-4b71-9d6f-6deb75cffce6","Type":"ContainerStarted","Data":"6dfa30ce9d1697a5e489db797a1c8ae7a8b957fa225c8f882fb224770e96949b"} Apr 17 17:25:11.007349 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:11.007318 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-sf4kc" event={"ID":"e9676b31-a86d-4645-96b9-1bfa16b53a94","Type":"ContainerStarted","Data":"da8ae6565fbd35ed2a60378785561bdc320e8166276f81de04282978b417ac7d"} Apr 17 17:25:11.540545 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:11.540506 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7l9m\" (UniqueName: \"kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m\") pod \"network-check-target-zgwbf\" (UID: \"fa8cdfa0-8080-411d-bd6e-51b977229392\") " pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:11.540722 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:11.540572 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs\") pod \"network-metrics-daemon-fbmql\" (UID: \"173598bb-6dcc-46e9-a78f-f3d5c1fd4297\") " pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:11.540797 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:11.540745 2546 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:11.540853 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:11.540811 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs podName:173598bb-6dcc-46e9-a78f-f3d5c1fd4297 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:13.540791725 +0000 UTC m=+6.163072382 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs") pod "network-metrics-daemon-fbmql" (UID: "173598bb-6dcc-46e9-a78f-f3d5c1fd4297") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:11.541250 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:11.541230 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:11.541352 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:11.541256 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:11.541352 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:11.541269 2546 projected.go:194] Error preparing data for projected volume kube-api-access-n7l9m for pod openshift-network-diagnostics/network-check-target-zgwbf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:11.541352 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:11.541323 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m podName:fa8cdfa0-8080-411d-bd6e-51b977229392 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:13.541307922 +0000 UTC m=+6.163588584 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n7l9m" (UniqueName: "kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m") pod "network-check-target-zgwbf" (UID: "fa8cdfa0-8080-411d-bd6e-51b977229392") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:11.952370 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:11.951790 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:11.952370 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:11.951911 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392" Apr 17 17:25:12.036938 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:12.036896 2546 generic.go:358] "Generic (PLEG): container finished" podID="01fb9b7f7df95b8fab2f65114ee88783" containerID="e7787c2521471ec3677c9dc172c3b97025767aec6af32400bbe3f98274dabf22" exitCode=0 Apr 17 17:25:12.037449 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:12.037394 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal" event={"ID":"01fb9b7f7df95b8fab2f65114ee88783","Type":"ContainerDied","Data":"e7787c2521471ec3677c9dc172c3b97025767aec6af32400bbe3f98274dabf22"} Apr 17 17:25:12.950035 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:12.949999 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:12.950227 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:12.950150 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297" Apr 17 17:25:13.044706 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:13.044507 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal" event={"ID":"01fb9b7f7df95b8fab2f65114ee88783","Type":"ContainerStarted","Data":"d9e215198a6d7febf60a6d6d6ba95c4d25effd4f7b8cf01b378614e662887e4f"} Apr 17 17:25:13.559892 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:13.559367 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7l9m\" (UniqueName: \"kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m\") pod \"network-check-target-zgwbf\" (UID: \"fa8cdfa0-8080-411d-bd6e-51b977229392\") " pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:13.559892 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:13.559417 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs\") pod \"network-metrics-daemon-fbmql\" (UID: \"173598bb-6dcc-46e9-a78f-f3d5c1fd4297\") " pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:13.559892 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:13.559511 2546 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered 
Apr 17 17:25:13.559892 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:13.559572 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs podName:173598bb-6dcc-46e9-a78f-f3d5c1fd4297 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:17.559554267 +0000 UTC m=+10.181834927 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs") pod "network-metrics-daemon-fbmql" (UID: "173598bb-6dcc-46e9-a78f-f3d5c1fd4297") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:13.559892 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:13.559660 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:13.559892 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:13.559695 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:13.559892 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:13.559708 2546 projected.go:194] Error preparing data for projected volume kube-api-access-n7l9m for pod openshift-network-diagnostics/network-check-target-zgwbf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:13.559892 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:13.559756 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m podName:fa8cdfa0-8080-411d-bd6e-51b977229392 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:17.559742143 +0000 UTC m=+10.182022802 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n7l9m" (UniqueName: "kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m") pod "network-check-target-zgwbf" (UID: "fa8cdfa0-8080-411d-bd6e-51b977229392") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:13.951483 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:13.949716 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:13.951483 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:13.949855 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392" Apr 17 17:25:14.949263 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:14.949229 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:14.949741 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:14.949388 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297" Apr 17 17:25:15.949442 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:15.949408 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:15.950040 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:15.949546 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392" Apr 17 17:25:16.950495 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:16.949981 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:16.950495 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:16.950122 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297" Apr 17 17:25:17.594583 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:17.593772 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7l9m\" (UniqueName: \"kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m\") pod \"network-check-target-zgwbf\" (UID: \"fa8cdfa0-8080-411d-bd6e-51b977229392\") " pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:17.594583 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:17.593866 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs\") pod \"network-metrics-daemon-fbmql\" (UID: \"173598bb-6dcc-46e9-a78f-f3d5c1fd4297\") " pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:17.594583 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:17.594017 2546 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:17.594583 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:17.594080 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs podName:173598bb-6dcc-46e9-a78f-f3d5c1fd4297 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:25.594062705 +0000 UTC m=+18.216343365 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs") pod "network-metrics-daemon-fbmql" (UID: "173598bb-6dcc-46e9-a78f-f3d5c1fd4297") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:17.594583 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:17.594477 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:17.594583 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:17.594498 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:17.594583 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:17.594511 2546 projected.go:194] Error preparing data for projected volume kube-api-access-n7l9m for pod openshift-network-diagnostics/network-check-target-zgwbf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:17.594583 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:17.594560 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m podName:fa8cdfa0-8080-411d-bd6e-51b977229392 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:25.594544493 +0000 UTC m=+18.216825152 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n7l9m" (UniqueName: "kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m") pod "network-check-target-zgwbf" (UID: "fa8cdfa0-8080-411d-bd6e-51b977229392") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:17.951276 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:17.950755 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:17.951276 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:17.950871 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392" Apr 17 17:25:18.949618 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:18.949448 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:18.949618 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:18.949605 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297" Apr 17 17:25:19.949658 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:19.949618 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:19.950155 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:19.949768 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392" Apr 17 17:25:20.949129 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:20.949093 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:20.949391 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:20.949245 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297" Apr 17 17:25:21.949171 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:21.949140 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:21.949562 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:21.949266 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392" Apr 17 17:25:22.949614 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:22.949587 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:22.949987 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:22.949729 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297" Apr 17 17:25:23.949660 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:23.949617 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:23.950124 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:23.949794 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392" Apr 17 17:25:24.949652 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:24.949617 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:24.949845 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:24.949772 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297" Apr 17 17:25:25.650088 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:25.649999 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7l9m\" (UniqueName: \"kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m\") pod \"network-check-target-zgwbf\" (UID: \"fa8cdfa0-8080-411d-bd6e-51b977229392\") " pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:25.650088 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:25.650057 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs\") pod \"network-metrics-daemon-fbmql\" (UID: \"173598bb-6dcc-46e9-a78f-f3d5c1fd4297\") " pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:25.650299 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:25.650167 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:25.650299 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:25.650188 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:25.650299 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:25.650191 2546 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:25.650299 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:25.650254 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs podName:173598bb-6dcc-46e9-a78f-f3d5c1fd4297 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:41.650234091 +0000 UTC m=+34.272514760 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs") pod "network-metrics-daemon-fbmql" (UID: "173598bb-6dcc-46e9-a78f-f3d5c1fd4297") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:25.650299 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:25.650197 2546 projected.go:194] Error preparing data for projected volume kube-api-access-n7l9m for pod openshift-network-diagnostics/network-check-target-zgwbf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:25.650526 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:25.650314 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m podName:fa8cdfa0-8080-411d-bd6e-51b977229392 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:41.650301367 +0000 UTC m=+34.272582019 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n7l9m" (UniqueName: "kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m") pod "network-check-target-zgwbf" (UID: "fa8cdfa0-8080-411d-bd6e-51b977229392") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:25.949832 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:25.949745 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:25.949981 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:25.949908 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392" Apr 17 17:25:26.949533 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:26.949493 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:26.949744 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:26.949634 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297" Apr 17 17:25:27.950249 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:27.950053 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:27.950929 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:27.950316 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392" Apr 17 17:25:28.082114 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.082077 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" event={"ID":"53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f","Type":"ContainerStarted","Data":"d516a40328af9cf8ae40665003f6083c67900c185fb3a395e812820fd7abacc5"} Apr 17 17:25:28.083359 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.083332 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r6rvm" event={"ID":"a2e022ed-9ba3-454c-9f27-4b300f2393d6","Type":"ContainerStarted","Data":"e67e303b25cdb62d47a924ac5a4f8a39e2c71ea627730ca6577a5ec71f82e85a"} Apr 17 17:25:28.084668 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.084644 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2phxj" event={"ID":"8b1e9f99-746c-4f28-80b9-ea9eb814cd98","Type":"ContainerStarted","Data":"35e83b6b7259d1799c8af8810404703206c0c5e13718f0a372408feb8fc9ae71"} Apr 17 17:25:28.086057 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.086031 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qz758" event={"ID":"7eb9cbde-0842-4e38-ab1b-0c93d220e92a","Type":"ContainerStarted","Data":"01774602763ba4f601055f8cb4efe21221b63924e41efe7c7ee6be98fcd40819"} Apr 17 17:25:28.087325 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.087307 2546 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz" event={"ID":"c5ad42a7-476b-4a76-800f-f9c45c9f695f","Type":"ContainerStarted","Data":"e328e32c188e92c5c71980a505eebdd7f1e3c4b8c28e0162a033368525137239"} Apr 17 17:25:28.089741 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.089726 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/ovn-acl-logging/0.log" Apr 17 17:25:28.090007 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.089986 2546 generic.go:358] "Generic (PLEG): container finished" podID="e869ba13-1af3-46e4-bbaa-eef8b748f612" containerID="87fa98f763c99d624cfdd6f238ad4da2eb4055eb8a4501cdbcea4a5dfaa39da9" exitCode=1 Apr 17 17:25:28.090093 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.090043 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" event={"ID":"e869ba13-1af3-46e4-bbaa-eef8b748f612","Type":"ContainerStarted","Data":"27332a08b14602b328caab4ddff145ff0ae072fe6aa2a796ff057156b40e087b"} Apr 17 17:25:28.090093 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.090062 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" event={"ID":"e869ba13-1af3-46e4-bbaa-eef8b748f612","Type":"ContainerStarted","Data":"b53d005ddf4354cd2f2a513ad59500476b77c08dcf6771cdeff4255536791504"} Apr 17 17:25:28.090093 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.090075 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" event={"ID":"e869ba13-1af3-46e4-bbaa-eef8b748f612","Type":"ContainerStarted","Data":"423e9beb77c1382c49537323fe42608f7756be81874f08a236204264919792bc"} Apr 17 17:25:28.090093 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.090082 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" 
event={"ID":"e869ba13-1af3-46e4-bbaa-eef8b748f612","Type":"ContainerStarted","Data":"8ac18afaeb1aa572a56e57179d5c130786dbe03b728a36f59b26eaed5a2f9a96"} Apr 17 17:25:28.090093 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.090090 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" event={"ID":"e869ba13-1af3-46e4-bbaa-eef8b748f612","Type":"ContainerDied","Data":"87fa98f763c99d624cfdd6f238ad4da2eb4055eb8a4501cdbcea4a5dfaa39da9"} Apr 17 17:25:28.090295 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.090101 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" event={"ID":"e869ba13-1af3-46e4-bbaa-eef8b748f612","Type":"ContainerStarted","Data":"9de471e50596421d56e0cc0797200f25a36c32daa0dd63de467949dac06fd153"} Apr 17 17:25:28.091441 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.091423 2546 generic.go:358] "Generic (PLEG): container finished" podID="17cc3e64-292f-4b71-9d6f-6deb75cffce6" containerID="e3484491671dfce643513aaafc367fa7a70be8bf17bd2b50614c6d4da33b671d" exitCode=0 Apr 17 17:25:28.091521 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.091468 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q92kj" event={"ID":"17cc3e64-292f-4b71-9d6f-6deb75cffce6","Type":"ContainerDied","Data":"e3484491671dfce643513aaafc367fa7a70be8bf17bd2b50614c6d4da33b671d"} Apr 17 17:25:28.092632 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.092608 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-sf4kc" event={"ID":"e9676b31-a86d-4645-96b9-1bfa16b53a94","Type":"ContainerStarted","Data":"0f108fdb950e2ebddf5bf3cd95acd61adb02dc71ed9c82a79bf0250c92753c73"} Apr 17 17:25:28.102646 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.102612 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-wpsq6" 
podStartSLOduration=3.346468781 podStartE2EDuration="20.102601212s" podCreationTimestamp="2026-04-17 17:25:08 +0000 UTC" firstStartedPulling="2026-04-17 17:25:10.503745398 +0000 UTC m=+3.126026056" lastFinishedPulling="2026-04-17 17:25:27.259877834 +0000 UTC m=+19.882158487" observedRunningTime="2026-04-17 17:25:28.102455526 +0000 UTC m=+20.724736200" watchObservedRunningTime="2026-04-17 17:25:28.102601212 +0000 UTC m=+20.724881886" Apr 17 17:25:28.103149 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.103126 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-46.ec2.internal" podStartSLOduration=19.103118945 podStartE2EDuration="19.103118945s" podCreationTimestamp="2026-04-17 17:25:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:13.059561538 +0000 UTC m=+5.681842228" watchObservedRunningTime="2026-04-17 17:25:28.103118945 +0000 UTC m=+20.725399621" Apr 17 17:25:28.119195 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.119159 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-sf4kc" podStartSLOduration=3.491827894 podStartE2EDuration="20.11914917s" podCreationTimestamp="2026-04-17 17:25:08 +0000 UTC" firstStartedPulling="2026-04-17 17:25:10.494616445 +0000 UTC m=+3.116897101" lastFinishedPulling="2026-04-17 17:25:27.121937709 +0000 UTC m=+19.744218377" observedRunningTime="2026-04-17 17:25:28.118845828 +0000 UTC m=+20.741126503" watchObservedRunningTime="2026-04-17 17:25:28.11914917 +0000 UTC m=+20.741429844" Apr 17 17:25:28.133946 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.133907 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-r6rvm" podStartSLOduration=3.37464287 podStartE2EDuration="20.133894513s" podCreationTimestamp="2026-04-17 
17:25:08 +0000 UTC" firstStartedPulling="2026-04-17 17:25:10.50053854 +0000 UTC m=+3.122819194" lastFinishedPulling="2026-04-17 17:25:27.259790167 +0000 UTC m=+19.882070837" observedRunningTime="2026-04-17 17:25:28.133491926 +0000 UTC m=+20.755772602" watchObservedRunningTime="2026-04-17 17:25:28.133894513 +0000 UTC m=+20.756175188" Apr 17 17:25:28.174814 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.174763 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qz758" podStartSLOduration=4.405349342 podStartE2EDuration="21.174750812s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="2026-04-17 17:25:10.497636673 +0000 UTC m=+3.119917331" lastFinishedPulling="2026-04-17 17:25:27.267038144 +0000 UTC m=+19.889318801" observedRunningTime="2026-04-17 17:25:28.174524626 +0000 UTC m=+20.796805301" watchObservedRunningTime="2026-04-17 17:25:28.174750812 +0000 UTC m=+20.797031486" Apr 17 17:25:28.190656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.190611 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2phxj" podStartSLOduration=7.844165953 podStartE2EDuration="20.190596189s" podCreationTimestamp="2026-04-17 17:25:08 +0000 UTC" firstStartedPulling="2026-04-17 17:25:10.500138708 +0000 UTC m=+3.122419374" lastFinishedPulling="2026-04-17 17:25:22.846568953 +0000 UTC m=+15.468849610" observedRunningTime="2026-04-17 17:25:28.190553597 +0000 UTC m=+20.812834272" watchObservedRunningTime="2026-04-17 17:25:28.190596189 +0000 UTC m=+20.812876866" Apr 17 17:25:28.722965 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.722912 2546 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 17:25:28.879976 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.879889 2546 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:25:28.722928799Z","UUID":"4afae911-df64-4a5c-bd68-ae368b2a9282","Handler":null,"Name":"","Endpoint":""} Apr 17 17:25:28.881627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.881605 2546 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 17:25:28.881627 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.881634 2546 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 17:25:28.949090 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:28.949057 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:28.949243 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:28.949187 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297" Apr 17 17:25:29.096632 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:29.096596 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-b4h7q" event={"ID":"19f42a69-1efb-4887-94b9-58c87acfe319","Type":"ContainerStarted","Data":"0d1f9c7307927ead28365aa703ebc779d2037b431778dcea17f1f0ba7b1b071e"} Apr 17 17:25:29.098748 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:29.098666 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz" event={"ID":"c5ad42a7-476b-4a76-800f-f9c45c9f695f","Type":"ContainerStarted","Data":"aa7279a83399fbd3a1683ea8ce2694c0708f85ccb0f21371ec1f6bbfb41cb4cf"} Apr 17 17:25:29.113539 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:29.113499 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-b4h7q" podStartSLOduration=4.358548477 podStartE2EDuration="21.113484689s" podCreationTimestamp="2026-04-17 17:25:08 +0000 UTC" firstStartedPulling="2026-04-17 17:25:10.50484975 +0000 UTC m=+3.127130405" lastFinishedPulling="2026-04-17 17:25:27.259785835 +0000 UTC m=+19.882066617" observedRunningTime="2026-04-17 17:25:29.113193052 +0000 UTC m=+21.735473726" watchObservedRunningTime="2026-04-17 17:25:29.113484689 +0000 UTC m=+21.735765409" Apr 17 17:25:29.949962 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:29.949877 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:29.950137 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:29.950001 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392" Apr 17 17:25:30.102665 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:30.102638 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz" event={"ID":"c5ad42a7-476b-4a76-800f-f9c45c9f695f","Type":"ContainerStarted","Data":"d11a38e12eae85cebe0f3233b38e553ef20a40e1c820246404ba10092b382cf4"} Apr 17 17:25:30.127101 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:30.127043 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z7qzz" podStartSLOduration=3.964572005 podStartE2EDuration="23.127023516s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="2026-04-17 17:25:10.496928678 +0000 UTC m=+3.119209346" lastFinishedPulling="2026-04-17 17:25:29.659380202 +0000 UTC m=+22.281660857" observedRunningTime="2026-04-17 17:25:30.126410177 +0000 UTC m=+22.748690859" watchObservedRunningTime="2026-04-17 17:25:30.127023516 +0000 UTC m=+22.749304194" Apr 17 17:25:30.949661 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:30.949454 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:30.949847 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:30.949774 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297" Apr 17 17:25:31.107828 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:31.107799 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/ovn-acl-logging/0.log" Apr 17 17:25:31.108224 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:31.108189 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" event={"ID":"e869ba13-1af3-46e4-bbaa-eef8b748f612","Type":"ContainerStarted","Data":"4a20473aa2a0e88085241cdca1a39ccfae20effa5727d71af9f7499c5309b97a"} Apr 17 17:25:31.328955 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:31.328921 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-sf4kc" Apr 17 17:25:31.329634 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:31.329602 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-sf4kc" Apr 17 17:25:31.949694 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:31.949647 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:31.949877 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:31.949794 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392" Apr 17 17:25:32.949788 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:32.949758 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:32.950461 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:32.949867 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297" Apr 17 17:25:33.114436 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:33.114407 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/ovn-acl-logging/0.log" Apr 17 17:25:33.114782 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:33.114758 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" event={"ID":"e869ba13-1af3-46e4-bbaa-eef8b748f612","Type":"ContainerStarted","Data":"18bbc572780b541879008891b30e6bd40f50cd0584752f3cec9de70d7f769015"} Apr 17 17:25:33.115016 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:33.115001 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:33.115206 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:33.115192 2546 scope.go:117] "RemoveContainer" containerID="87fa98f763c99d624cfdd6f238ad4da2eb4055eb8a4501cdbcea4a5dfaa39da9" Apr 17 17:25:33.116347 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:33.116321 2546 generic.go:358] "Generic (PLEG): container finished" podID="17cc3e64-292f-4b71-9d6f-6deb75cffce6" containerID="e9840495d5354db1050375d10204d0306e9e72a4e0b5836afe5ddb7bfa7dcbbb" exitCode=0 Apr 17 17:25:33.116423 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:33.116361 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q92kj" event={"ID":"17cc3e64-292f-4b71-9d6f-6deb75cffce6","Type":"ContainerDied","Data":"e9840495d5354db1050375d10204d0306e9e72a4e0b5836afe5ddb7bfa7dcbbb"} Apr 17 17:25:33.131324 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:33.131303 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:33.659561 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:33.659527 2546 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-sf4kc" Apr 17 17:25:33.659735 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:33.659671 2546 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 17:25:33.660177 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:33.660152 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-sf4kc" Apr 17 17:25:33.949084 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:33.949060 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:33.949207 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:33.949185 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392" Apr 17 17:25:34.121087 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:34.121059 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/ovn-acl-logging/0.log" Apr 17 17:25:34.121538 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:34.121380 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" event={"ID":"e869ba13-1af3-46e4-bbaa-eef8b748f612","Type":"ContainerStarted","Data":"02f0d3e03854373f3ca62516ad9c68c84ada298be094fd306652bc4f22cc2165"} Apr 17 17:25:34.121717 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:34.121699 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:34.121816 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:34.121724 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:34.123288 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:34.123265 2546 generic.go:358] "Generic (PLEG): container finished" podID="17cc3e64-292f-4b71-9d6f-6deb75cffce6" containerID="15c79877b71eadbfb6e07baf5071a65bc06bc402f8bc751c8e8c373ef5512d1d" exitCode=0 Apr 17 17:25:34.123394 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:34.123355 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q92kj" event={"ID":"17cc3e64-292f-4b71-9d6f-6deb75cffce6","Type":"ContainerDied","Data":"15c79877b71eadbfb6e07baf5071a65bc06bc402f8bc751c8e8c373ef5512d1d"} Apr 17 17:25:34.136754 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:34.136733 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:25:34.146275 ip-10-0-137-46 kubenswrapper[2546]: 
I0417 17:25:34.146236 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" podStartSLOduration=9.316151872 podStartE2EDuration="26.146224517s" podCreationTimestamp="2026-04-17 17:25:08 +0000 UTC" firstStartedPulling="2026-04-17 17:25:10.505927761 +0000 UTC m=+3.128208422" lastFinishedPulling="2026-04-17 17:25:27.33600041 +0000 UTC m=+19.958281067" observedRunningTime="2026-04-17 17:25:34.145470719 +0000 UTC m=+26.767751393" watchObservedRunningTime="2026-04-17 17:25:34.146224517 +0000 UTC m=+26.768505191" Apr 17 17:25:34.263343 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:34.263259 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zgwbf"] Apr 17 17:25:34.263499 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:34.263391 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:34.263551 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:34.263492 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392" Apr 17 17:25:34.264016 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:34.263995 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fbmql"] Apr 17 17:25:34.264108 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:34.264088 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:34.264219 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:34.264189 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297" Apr 17 17:25:35.127251 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:35.127218 2546 generic.go:358] "Generic (PLEG): container finished" podID="17cc3e64-292f-4b71-9d6f-6deb75cffce6" containerID="46825358bfb0ffad741b97acea45c41206a9290d3dbdac674a347e63e779c33d" exitCode=0 Apr 17 17:25:35.127626 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:35.127298 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q92kj" event={"ID":"17cc3e64-292f-4b71-9d6f-6deb75cffce6","Type":"ContainerDied","Data":"46825358bfb0ffad741b97acea45c41206a9290d3dbdac674a347e63e779c33d"} Apr 17 17:25:35.949566 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:35.949532 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:35.949768 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:35.949532 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:35.949768 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:35.949673 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297" Apr 17 17:25:35.949768 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:35.949722 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392" Apr 17 17:25:37.950871 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:37.950628 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:37.951273 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:37.950723 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:37.951273 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:37.950965 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392" Apr 17 17:25:37.951273 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:37.951023 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297" Apr 17 17:25:39.949621 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:39.949590 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:39.950038 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:39.949747 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgwbf" podUID="fa8cdfa0-8080-411d-bd6e-51b977229392" Apr 17 17:25:39.950038 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:39.949814 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:39.950038 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:39.949949 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297" Apr 17 17:25:40.223511 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.223433 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-46.ec2.internal" event="NodeReady" Apr 17 17:25:40.223788 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.223643 2546 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 17:25:40.271543 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.271504 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sm5m5"] Apr 17 17:25:40.289458 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.289428 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kgnl9"] Apr 17 17:25:40.290220 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.289828 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:40.293527 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.292421 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w4qb9\"" Apr 17 17:25:40.293527 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.292659 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 17:25:40.293527 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.292854 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 17:25:40.302966 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.302943 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sm5m5"] Apr 17 17:25:40.303101 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.302971 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kgnl9"] Apr 17 
17:25:40.303101 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.303080 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:25:40.305632 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.305609 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 17:25:40.305632 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.305627 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fhhs9\"" Apr 17 17:25:40.305811 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.305633 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 17:25:40.305811 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.305765 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 17:25:40.358612 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.358580 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d82d71b1-2458-4671-b28c-5e3870cd761a-tmp-dir\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:40.358783 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.358706 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57tps\" (UniqueName: \"kubernetes.io/projected/d82d71b1-2458-4671-b28c-5e3870cd761a-kube-api-access-57tps\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:40.358783 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.358762 2546 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d82d71b1-2458-4671-b28c-5e3870cd761a-config-volume\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:40.358897 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.358793 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:40.459145 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.459105 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d82d71b1-2458-4671-b28c-5e3870cd761a-config-volume\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:40.459357 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.459157 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b69r\" (UniqueName: \"kubernetes.io/projected/e4397ebe-1923-4566-89e4-f777e71713b1-kube-api-access-2b69r\") pod \"ingress-canary-kgnl9\" (UID: \"e4397ebe-1923-4566-89e4-f777e71713b1\") " pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:25:40.459357 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.459190 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:40.459357 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.459227 2546 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d82d71b1-2458-4671-b28c-5e3870cd761a-tmp-dir\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:40.459357 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.459295 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert\") pod \"ingress-canary-kgnl9\" (UID: \"e4397ebe-1923-4566-89e4-f777e71713b1\") " pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:25:40.459357 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.459339 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57tps\" (UniqueName: \"kubernetes.io/projected/d82d71b1-2458-4671-b28c-5e3870cd761a-kube-api-access-57tps\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:40.459621 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:40.459420 2546 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:40.459621 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:40.459478 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls podName:d82d71b1-2458-4671-b28c-5e3870cd761a nodeName:}" failed. No retries permitted until 2026-04-17 17:25:40.959458027 +0000 UTC m=+33.581738695 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls") pod "dns-default-sm5m5" (UID: "d82d71b1-2458-4671-b28c-5e3870cd761a") : secret "dns-default-metrics-tls" not found Apr 17 17:25:40.459621 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.459597 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d82d71b1-2458-4671-b28c-5e3870cd761a-tmp-dir\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:40.459838 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.459818 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d82d71b1-2458-4671-b28c-5e3870cd761a-config-volume\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:40.470281 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.470257 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57tps\" (UniqueName: \"kubernetes.io/projected/d82d71b1-2458-4671-b28c-5e3870cd761a-kube-api-access-57tps\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:40.560333 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.560298 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2b69r\" (UniqueName: \"kubernetes.io/projected/e4397ebe-1923-4566-89e4-f777e71713b1-kube-api-access-2b69r\") pod \"ingress-canary-kgnl9\" (UID: \"e4397ebe-1923-4566-89e4-f777e71713b1\") " pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:25:40.560490 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.560365 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert\") pod \"ingress-canary-kgnl9\" (UID: \"e4397ebe-1923-4566-89e4-f777e71713b1\") " pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:25:40.560490 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:40.560452 2546 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:40.560563 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:40.560503 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert podName:e4397ebe-1923-4566-89e4-f777e71713b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:41.060489891 +0000 UTC m=+33.682770544 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert") pod "ingress-canary-kgnl9" (UID: "e4397ebe-1923-4566-89e4-f777e71713b1") : secret "canary-serving-cert" not found Apr 17 17:25:40.571150 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.571127 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b69r\" (UniqueName: \"kubernetes.io/projected/e4397ebe-1923-4566-89e4-f777e71713b1-kube-api-access-2b69r\") pod \"ingress-canary-kgnl9\" (UID: \"e4397ebe-1923-4566-89e4-f777e71713b1\") " pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:25:40.962648 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:40.962618 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:40.963218 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:40.962773 2546 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: 
secret "dns-default-metrics-tls" not found Apr 17 17:25:40.963218 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:40.962846 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls podName:d82d71b1-2458-4671-b28c-5e3870cd761a nodeName:}" failed. No retries permitted until 2026-04-17 17:25:41.962830249 +0000 UTC m=+34.585110907 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls") pod "dns-default-sm5m5" (UID: "d82d71b1-2458-4671-b28c-5e3870cd761a") : secret "dns-default-metrics-tls" not found Apr 17 17:25:41.063991 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:41.063965 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert\") pod \"ingress-canary-kgnl9\" (UID: \"e4397ebe-1923-4566-89e4-f777e71713b1\") " pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:25:41.064098 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:41.064071 2546 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:41.064152 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:41.064126 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert podName:e4397ebe-1923-4566-89e4-f777e71713b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:42.064111492 +0000 UTC m=+34.686392165 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert") pod "ingress-canary-kgnl9" (UID: "e4397ebe-1923-4566-89e4-f777e71713b1") : secret "canary-serving-cert" not found Apr 17 17:25:41.145234 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:41.145150 2546 generic.go:358] "Generic (PLEG): container finished" podID="17cc3e64-292f-4b71-9d6f-6deb75cffce6" containerID="562dc930296730c81e1a5ca13c98af3f1a0720f1f772c5a1543277f023729bbc" exitCode=0 Apr 17 17:25:41.145234 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:41.145198 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q92kj" event={"ID":"17cc3e64-292f-4b71-9d6f-6deb75cffce6","Type":"ContainerDied","Data":"562dc930296730c81e1a5ca13c98af3f1a0720f1f772c5a1543277f023729bbc"} Apr 17 17:25:41.668691 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:41.668654 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7l9m\" (UniqueName: \"kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m\") pod \"network-check-target-zgwbf\" (UID: \"fa8cdfa0-8080-411d-bd6e-51b977229392\") " pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:41.668843 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:41.668725 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs\") pod \"network-metrics-daemon-fbmql\" (UID: \"173598bb-6dcc-46e9-a78f-f3d5c1fd4297\") " pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:41.668843 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:41.668818 2546 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:41.668843 ip-10-0-137-46 kubenswrapper[2546]: E0417 
17:25:41.668829 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:41.668959 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:41.668850 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:41.668959 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:41.668863 2546 projected.go:194] Error preparing data for projected volume kube-api-access-n7l9m for pod openshift-network-diagnostics/network-check-target-zgwbf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:41.668959 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:41.668889 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs podName:173598bb-6dcc-46e9-a78f-f3d5c1fd4297 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:13.668867703 +0000 UTC m=+66.291148359 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs") pod "network-metrics-daemon-fbmql" (UID: "173598bb-6dcc-46e9-a78f-f3d5c1fd4297") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:41.668959 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:41.668905 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m podName:fa8cdfa0-8080-411d-bd6e-51b977229392 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:13.668897135 +0000 UTC m=+66.291177788 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n7l9m" (UniqueName: "kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m") pod "network-check-target-zgwbf" (UID: "fa8cdfa0-8080-411d-bd6e-51b977229392") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:41.949919 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:41.949883 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:25:41.950100 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:41.949883 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:25:41.952571 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:41.952545 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:25:41.952735 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:41.952553 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lkqvr\"" Apr 17 17:25:41.952735 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:41.952551 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:25:41.952864 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:41.952763 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:25:41.953692 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:41.953657 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wl5wm\"" Apr 17 17:25:41.970577 ip-10-0-137-46 kubenswrapper[2546]: I0417 
17:25:41.970560 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:41.970921 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:41.970647 2546 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:41.970921 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:41.970707 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls podName:d82d71b1-2458-4671-b28c-5e3870cd761a nodeName:}" failed. No retries permitted until 2026-04-17 17:25:43.9706941 +0000 UTC m=+36.592974770 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls") pod "dns-default-sm5m5" (UID: "d82d71b1-2458-4671-b28c-5e3870cd761a") : secret "dns-default-metrics-tls" not found Apr 17 17:25:42.071704 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:42.071666 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert\") pod \"ingress-canary-kgnl9\" (UID: \"e4397ebe-1923-4566-89e4-f777e71713b1\") " pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:25:42.071830 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:42.071810 2546 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:42.072013 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:42.071888 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert 
podName:e4397ebe-1923-4566-89e4-f777e71713b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:44.071867298 +0000 UTC m=+36.694147956 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert") pod "ingress-canary-kgnl9" (UID: "e4397ebe-1923-4566-89e4-f777e71713b1") : secret "canary-serving-cert" not found Apr 17 17:25:42.154942 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:42.154902 2546 generic.go:358] "Generic (PLEG): container finished" podID="17cc3e64-292f-4b71-9d6f-6deb75cffce6" containerID="91acd1ac4cacf706519aad60809d132ddb409bd7d0e6942b5d89592a8c7841fc" exitCode=0 Apr 17 17:25:42.155090 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:42.154966 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q92kj" event={"ID":"17cc3e64-292f-4b71-9d6f-6deb75cffce6","Type":"ContainerDied","Data":"91acd1ac4cacf706519aad60809d132ddb409bd7d0e6942b5d89592a8c7841fc"} Apr 17 17:25:43.159800 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:43.159769 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q92kj" event={"ID":"17cc3e64-292f-4b71-9d6f-6deb75cffce6","Type":"ContainerStarted","Data":"33a869f54de9bd6bb07719c5b86f1ba43d131f7de8c5a0c4d8800ae9d2a98b66"} Apr 17 17:25:43.185925 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:43.185878 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-q92kj" podStartSLOduration=4.888201766 podStartE2EDuration="35.185863438s" podCreationTimestamp="2026-04-17 17:25:08 +0000 UTC" firstStartedPulling="2026-04-17 17:25:10.502391406 +0000 UTC m=+3.124672060" lastFinishedPulling="2026-04-17 17:25:40.800053076 +0000 UTC m=+33.422333732" observedRunningTime="2026-04-17 17:25:43.185288633 +0000 UTC m=+35.807569309" watchObservedRunningTime="2026-04-17 
17:25:43.185863438 +0000 UTC m=+35.808144161" Apr 17 17:25:43.982567 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:43.982530 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:43.982718 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:43.982644 2546 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:43.982718 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:43.982707 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls podName:d82d71b1-2458-4671-b28c-5e3870cd761a nodeName:}" failed. No retries permitted until 2026-04-17 17:25:47.982693266 +0000 UTC m=+40.604973919 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls") pod "dns-default-sm5m5" (UID: "d82d71b1-2458-4671-b28c-5e3870cd761a") : secret "dns-default-metrics-tls" not found Apr 17 17:25:44.083514 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:44.083435 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert\") pod \"ingress-canary-kgnl9\" (UID: \"e4397ebe-1923-4566-89e4-f777e71713b1\") " pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:25:44.083649 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:44.083550 2546 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:44.083649 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:44.083600 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert podName:e4397ebe-1923-4566-89e4-f777e71713b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:48.083587474 +0000 UTC m=+40.705868128 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert") pod "ingress-canary-kgnl9" (UID: "e4397ebe-1923-4566-89e4-f777e71713b1") : secret "canary-serving-cert" not found Apr 17 17:25:48.008753 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:48.008547 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:48.009205 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:48.008699 2546 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:48.009205 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:48.008816 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls podName:d82d71b1-2458-4671-b28c-5e3870cd761a nodeName:}" failed. No retries permitted until 2026-04-17 17:25:56.008802277 +0000 UTC m=+48.631082930 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls") pod "dns-default-sm5m5" (UID: "d82d71b1-2458-4671-b28c-5e3870cd761a") : secret "dns-default-metrics-tls" not found Apr 17 17:25:48.109218 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:48.109175 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert\") pod \"ingress-canary-kgnl9\" (UID: \"e4397ebe-1923-4566-89e4-f777e71713b1\") " pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:25:48.109349 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:48.109315 2546 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:48.109404 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:48.109378 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert podName:e4397ebe-1923-4566-89e4-f777e71713b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:56.109365269 +0000 UTC m=+48.731645921 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert") pod "ingress-canary-kgnl9" (UID: "e4397ebe-1923-4566-89e4-f777e71713b1") : secret "canary-serving-cert" not found Apr 17 17:25:56.061645 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:56.061612 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:25:56.062113 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:56.061765 2546 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:56.062113 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:56.061835 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls podName:d82d71b1-2458-4671-b28c-5e3870cd761a nodeName:}" failed. No retries permitted until 2026-04-17 17:26:12.061819611 +0000 UTC m=+64.684100263 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls") pod "dns-default-sm5m5" (UID: "d82d71b1-2458-4671-b28c-5e3870cd761a") : secret "dns-default-metrics-tls" not found Apr 17 17:25:56.162445 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:25:56.162414 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert\") pod \"ingress-canary-kgnl9\" (UID: \"e4397ebe-1923-4566-89e4-f777e71713b1\") " pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:25:56.162640 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:56.162522 2546 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:56.162640 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:25:56.162580 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert podName:e4397ebe-1923-4566-89e4-f777e71713b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:12.162566358 +0000 UTC m=+64.784847011 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert") pod "ingress-canary-kgnl9" (UID: "e4397ebe-1923-4566-89e4-f777e71713b1") : secret "canary-serving-cert" not found Apr 17 17:26:06.139461 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:06.139433 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ft44m" Apr 17 17:26:12.076641 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:12.076597 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:26:12.077126 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:26:12.076746 2546 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:26:12.077126 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:26:12.076832 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls podName:d82d71b1-2458-4671-b28c-5e3870cd761a nodeName:}" failed. No retries permitted until 2026-04-17 17:26:44.076809746 +0000 UTC m=+96.699090422 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls") pod "dns-default-sm5m5" (UID: "d82d71b1-2458-4671-b28c-5e3870cd761a") : secret "dns-default-metrics-tls" not found Apr 17 17:26:12.177169 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:12.177125 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert\") pod \"ingress-canary-kgnl9\" (UID: \"e4397ebe-1923-4566-89e4-f777e71713b1\") " pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:26:12.177341 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:26:12.177278 2546 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:26:12.177387 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:26:12.177356 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert podName:e4397ebe-1923-4566-89e4-f777e71713b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:44.177339963 +0000 UTC m=+96.799620617 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert") pod "ingress-canary-kgnl9" (UID: "e4397ebe-1923-4566-89e4-f777e71713b1") : secret "canary-serving-cert" not found Apr 17 17:26:13.686903 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:13.686850 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7l9m\" (UniqueName: \"kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m\") pod \"network-check-target-zgwbf\" (UID: \"fa8cdfa0-8080-411d-bd6e-51b977229392\") " pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:26:13.687351 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:13.686923 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs\") pod \"network-metrics-daemon-fbmql\" (UID: \"173598bb-6dcc-46e9-a78f-f3d5c1fd4297\") " pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:26:13.689779 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:13.689758 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:26:13.689854 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:13.689829 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:26:13.697718 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:26:13.697699 2546 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:26:13.697772 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:26:13.697763 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs podName:173598bb-6dcc-46e9-a78f-f3d5c1fd4297 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:27:17.697747043 +0000 UTC m=+130.320027702 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs") pod "network-metrics-daemon-fbmql" (UID: "173598bb-6dcc-46e9-a78f-f3d5c1fd4297") : secret "metrics-daemon-secret" not found Apr 17 17:26:13.700099 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:13.700084 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:26:13.710917 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:13.710886 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7l9m\" (UniqueName: \"kubernetes.io/projected/fa8cdfa0-8080-411d-bd6e-51b977229392-kube-api-access-n7l9m\") pod \"network-check-target-zgwbf\" (UID: \"fa8cdfa0-8080-411d-bd6e-51b977229392\") " pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:26:13.766423 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:13.766395 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wl5wm\"" Apr 17 17:26:13.774689 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:13.774659 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:26:13.897707 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:13.897660 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zgwbf"] Apr 17 17:26:13.901886 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:26:13.901861 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa8cdfa0_8080_411d_bd6e_51b977229392.slice/crio-cbbaa57d2ae1b724b8aec384178794ee5971d18e53cbbdfeebc99cae2a7c5f1e WatchSource:0}: Error finding container cbbaa57d2ae1b724b8aec384178794ee5971d18e53cbbdfeebc99cae2a7c5f1e: Status 404 returned error can't find the container with id cbbaa57d2ae1b724b8aec384178794ee5971d18e53cbbdfeebc99cae2a7c5f1e Apr 17 17:26:14.219160 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:14.219124 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zgwbf" event={"ID":"fa8cdfa0-8080-411d-bd6e-51b977229392","Type":"ContainerStarted","Data":"cbbaa57d2ae1b724b8aec384178794ee5971d18e53cbbdfeebc99cae2a7c5f1e"} Apr 17 17:26:17.225791 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:17.225753 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zgwbf" event={"ID":"fa8cdfa0-8080-411d-bd6e-51b977229392","Type":"ContainerStarted","Data":"f53e2f608b7a1a4a7559981b4eea503b04a6adc312edecb7f36413529f83f113"} Apr 17 17:26:17.226182 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:17.225863 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:26:17.246599 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:17.246549 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zgwbf" 
podStartSLOduration=66.740763724 podStartE2EDuration="1m9.246534807s" podCreationTimestamp="2026-04-17 17:25:08 +0000 UTC" firstStartedPulling="2026-04-17 17:26:13.903613357 +0000 UTC m=+66.525894014" lastFinishedPulling="2026-04-17 17:26:16.409384435 +0000 UTC m=+69.031665097" observedRunningTime="2026-04-17 17:26:17.245764032 +0000 UTC m=+69.868044720" watchObservedRunningTime="2026-04-17 17:26:17.246534807 +0000 UTC m=+69.868815481" Apr 17 17:26:44.103157 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:44.103071 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:26:44.103552 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:26:44.103216 2546 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:26:44.103552 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:26:44.103279 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls podName:d82d71b1-2458-4671-b28c-5e3870cd761a nodeName:}" failed. No retries permitted until 2026-04-17 17:27:48.103265213 +0000 UTC m=+160.725545871 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls") pod "dns-default-sm5m5" (UID: "d82d71b1-2458-4671-b28c-5e3870cd761a") : secret "dns-default-metrics-tls" not found Apr 17 17:26:44.203448 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:44.203409 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert\") pod \"ingress-canary-kgnl9\" (UID: \"e4397ebe-1923-4566-89e4-f777e71713b1\") " pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:26:44.203597 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:26:44.203559 2546 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:26:44.203639 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:26:44.203628 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert podName:e4397ebe-1923-4566-89e4-f777e71713b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:48.203612597 +0000 UTC m=+160.825893250 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert") pod "ingress-canary-kgnl9" (UID: "e4397ebe-1923-4566-89e4-f777e71713b1") : secret "canary-serving-cert" not found Apr 17 17:26:48.230296 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:26:48.230266 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zgwbf" Apr 17 17:27:01.192572 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.192534 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-99kn4"] Apr 17 17:27:01.197092 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.197070 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.199441 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.199412 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 17:27:01.199602 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.199574 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-w78ht\"" Apr 17 17:27:01.200298 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.200278 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 17:27:01.200401 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.200336 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 17:27:01.200570 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.200552 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 17:27:01.204871 ip-10-0-137-46 kubenswrapper[2546]: I0417 
17:27:01.204852 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-99kn4"] Apr 17 17:27:01.208027 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.208008 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 17:27:01.300784 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.300752 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47"] Apr 17 17:27:01.303739 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.303722 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-cc85759b6-w8jxm"] Apr 17 17:27:01.303894 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.303878 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" Apr 17 17:27:01.306645 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.306625 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 17:27:01.306756 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.306658 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:01.307294 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.307276 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 17:27:01.307366 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.307299 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-wh4kv\"" Apr 17 17:27:01.308473 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.308457 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 17:27:01.308525 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.308474 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 17:27:01.317883 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.317866 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47"] Apr 17 17:27:01.318133 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.318117 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 17:27:01.318223 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.318207 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kvtqb\"" Apr 17 17:27:01.318400 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.318385 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 17:27:01.318459 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.318444 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 
17 17:27:01.318512 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.318465 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 17:27:01.318512 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.318500 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 17:27:01.318606 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.318452 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 17:27:01.320077 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.320053 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-tmp\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.320174 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.320107 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-serving-cert\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.320174 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.320131 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-service-ca-bundle\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.320174 ip-10-0-137-46 
kubenswrapper[2546]: I0417 17:27:01.320154 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7wlb\" (UniqueName: \"kubernetes.io/projected/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-kube-api-access-s7wlb\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.320305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.320197 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.320305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.320224 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-snapshots\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.334850 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.334818 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-cc85759b6-w8jxm"] Apr 17 17:27:01.421068 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.421025 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2mdx\" (UniqueName: \"kubernetes.io/projected/61d0dab8-9f71-4bee-b48b-178b647667dd-kube-api-access-t2mdx\") pod \"cluster-monitoring-operator-75587bd455-8kt47\" (UID: \"61d0dab8-9f71-4bee-b48b-178b647667dd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" Apr 17 
17:27:01.421068 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.421063 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/61d0dab8-9f71-4bee-b48b-178b647667dd-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8kt47\" (UID: \"61d0dab8-9f71-4bee-b48b-178b647667dd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" Apr 17 17:27:01.421318 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.421080 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8kt47\" (UID: \"61d0dab8-9f71-4bee-b48b-178b647667dd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" Apr 17 17:27:01.421318 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.421102 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-snapshots\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.421318 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.421138 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-tmp\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.421318 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.421154 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-stats-auth\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:01.421318 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.421179 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-default-certificate\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:01.421318 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.421195 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:01.421318 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.421217 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-serving-cert\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.421318 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.421243 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-service-ca-bundle\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.421318 ip-10-0-137-46 
kubenswrapper[2546]: I0417 17:27:01.421270 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7wlb\" (UniqueName: \"kubernetes.io/projected/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-kube-api-access-s7wlb\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.421318 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.421298 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:01.421781 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.421375 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.421781 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.421401 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlhrx\" (UniqueName: \"kubernetes.io/projected/e0ddb199-4f09-4f38-9d09-304ed7807840-kube-api-access-rlhrx\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:01.421781 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.421753 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-tmp\") pod 
\"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.421925 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.421869 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-snapshots\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.421925 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.421885 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-service-ca-bundle\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.422301 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.422281 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.423639 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.423619 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-serving-cert\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.429693 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.429663 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s7wlb\" (UniqueName: \"kubernetes.io/projected/e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc-kube-api-access-s7wlb\") pod \"insights-operator-585dfdc468-99kn4\" (UID: \"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc\") " pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.506869 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.506788 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-99kn4" Apr 17 17:27:01.521668 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.521639 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlhrx\" (UniqueName: \"kubernetes.io/projected/e0ddb199-4f09-4f38-9d09-304ed7807840-kube-api-access-rlhrx\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:01.521810 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.521698 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2mdx\" (UniqueName: \"kubernetes.io/projected/61d0dab8-9f71-4bee-b48b-178b647667dd-kube-api-access-t2mdx\") pod \"cluster-monitoring-operator-75587bd455-8kt47\" (UID: \"61d0dab8-9f71-4bee-b48b-178b647667dd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" Apr 17 17:27:01.521810 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.521730 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/61d0dab8-9f71-4bee-b48b-178b647667dd-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8kt47\" (UID: \"61d0dab8-9f71-4bee-b48b-178b647667dd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" Apr 17 17:27:01.521924 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.521900 2546 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8kt47\" (UID: \"61d0dab8-9f71-4bee-b48b-178b647667dd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" Apr 17 17:27:01.522017 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.521998 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-stats-auth\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:01.522072 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:01.522002 2546 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:27:01.522072 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.522054 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-default-certificate\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:01.522172 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.522079 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:01.522172 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:01.522118 2546 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls podName:61d0dab8-9f71-4bee-b48b-178b647667dd nodeName:}" failed. No retries permitted until 2026-04-17 17:27:02.022096403 +0000 UTC m=+114.644377059 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8kt47" (UID: "61d0dab8-9f71-4bee-b48b-178b647667dd") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:27:01.522172 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:01.522160 2546 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:27:01.522335 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.522173 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:01.522335 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:01.522218 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs podName:e0ddb199-4f09-4f38-9d09-304ed7807840 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:02.022201396 +0000 UTC m=+114.644482071 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs") pod "router-default-cc85759b6-w8jxm" (UID: "e0ddb199-4f09-4f38-9d09-304ed7807840") : secret "router-metrics-certs-default" not found Apr 17 17:27:01.522335 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:01.522272 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle podName:e0ddb199-4f09-4f38-9d09-304ed7807840 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:02.022260693 +0000 UTC m=+114.644541366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle") pod "router-default-cc85759b6-w8jxm" (UID: "e0ddb199-4f09-4f38-9d09-304ed7807840") : configmap references non-existent config key: service-ca.crt Apr 17 17:27:01.525035 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.522888 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/61d0dab8-9f71-4bee-b48b-178b647667dd-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8kt47\" (UID: \"61d0dab8-9f71-4bee-b48b-178b647667dd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" Apr 17 17:27:01.525035 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.525529 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-stats-auth\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:01.526207 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.526090 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-default-certificate\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:01.531058 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.531029 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2mdx\" (UniqueName: \"kubernetes.io/projected/61d0dab8-9f71-4bee-b48b-178b647667dd-kube-api-access-t2mdx\") pod \"cluster-monitoring-operator-75587bd455-8kt47\" (UID: \"61d0dab8-9f71-4bee-b48b-178b647667dd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" Apr 17 17:27:01.531430 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.531411 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlhrx\" (UniqueName: \"kubernetes.io/projected/e0ddb199-4f09-4f38-9d09-304ed7807840-kube-api-access-rlhrx\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:01.620117 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:01.620083 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-99kn4"] Apr 17 17:27:01.623266 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:01.623241 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8e28f97_fdcd_4cdd_bc14_75b5ce293dcc.slice/crio-50d0b6809416a4358a96a8113635fb6de8206b093f88a34aa4aa60553ad9a974 WatchSource:0}: Error finding container 50d0b6809416a4358a96a8113635fb6de8206b093f88a34aa4aa60553ad9a974: Status 404 returned error can't find the container with id 50d0b6809416a4358a96a8113635fb6de8206b093f88a34aa4aa60553ad9a974 Apr 17 17:27:02.027042 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:02.026987 2546 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8kt47\" (UID: \"61d0dab8-9f71-4bee-b48b-178b647667dd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" Apr 17 17:27:02.027239 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:02.027068 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:02.027239 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:02.027097 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:02.027239 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:02.027143 2546 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:27:02.027239 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:02.027190 2546 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:27:02.027239 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:02.027206 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle podName:e0ddb199-4f09-4f38-9d09-304ed7807840 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:27:03.027193266 +0000 UTC m=+115.649473919 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle") pod "router-default-cc85759b6-w8jxm" (UID: "e0ddb199-4f09-4f38-9d09-304ed7807840") : configmap references non-existent config key: service-ca.crt Apr 17 17:27:02.027239 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:02.027220 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls podName:61d0dab8-9f71-4bee-b48b-178b647667dd nodeName:}" failed. No retries permitted until 2026-04-17 17:27:03.027213926 +0000 UTC m=+115.649494579 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8kt47" (UID: "61d0dab8-9f71-4bee-b48b-178b647667dd") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:27:02.027239 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:02.027231 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs podName:e0ddb199-4f09-4f38-9d09-304ed7807840 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:03.027224696 +0000 UTC m=+115.649505349 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs") pod "router-default-cc85759b6-w8jxm" (UID: "e0ddb199-4f09-4f38-9d09-304ed7807840") : secret "router-metrics-certs-default" not found Apr 17 17:27:02.309146 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:02.309112 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-99kn4" event={"ID":"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc","Type":"ContainerStarted","Data":"50d0b6809416a4358a96a8113635fb6de8206b093f88a34aa4aa60553ad9a974"} Apr 17 17:27:03.035259 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:03.035213 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8kt47\" (UID: \"61d0dab8-9f71-4bee-b48b-178b647667dd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" Apr 17 17:27:03.035444 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:03.035320 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:03.035444 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:03.035368 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:03.035444 ip-10-0-137-46 
kubenswrapper[2546]: E0417 17:27:03.035381 2546 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:27:03.035599 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:03.035467 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls podName:61d0dab8-9f71-4bee-b48b-178b647667dd nodeName:}" failed. No retries permitted until 2026-04-17 17:27:05.035444025 +0000 UTC m=+117.657724697 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8kt47" (UID: "61d0dab8-9f71-4bee-b48b-178b647667dd") : secret "cluster-monitoring-operator-tls" not found
Apr 17 17:27:03.035599 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:03.035466 2546 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 17:27:03.035599 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:03.035506 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle podName:e0ddb199-4f09-4f38-9d09-304ed7807840 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:05.035487789 +0000 UTC m=+117.657768450 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle") pod "router-default-cc85759b6-w8jxm" (UID: "e0ddb199-4f09-4f38-9d09-304ed7807840") : configmap references non-existent config key: service-ca.crt
Apr 17 17:27:03.035599 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:03.035526 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs podName:e0ddb199-4f09-4f38-9d09-304ed7807840 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:05.035516127 +0000 UTC m=+117.657796783 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs") pod "router-default-cc85759b6-w8jxm" (UID: "e0ddb199-4f09-4f38-9d09-304ed7807840") : secret "router-metrics-certs-default" not found
Apr 17 17:27:04.314899 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:04.314862 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-99kn4" event={"ID":"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc","Type":"ContainerStarted","Data":"b9c47d4c4af871ff271e96cd120cf2064cb5e8b95261235922095eb3e37b897c"}
Apr 17 17:27:04.333280 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:04.333222 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-99kn4" podStartSLOduration=1.4465141049999999 podStartE2EDuration="3.33320537s" podCreationTimestamp="2026-04-17 17:27:01 +0000 UTC" firstStartedPulling="2026-04-17 17:27:01.624839716 +0000 UTC m=+114.247120369" lastFinishedPulling="2026-04-17 17:27:03.511530968 +0000 UTC m=+116.133811634" observedRunningTime="2026-04-17 17:27:04.332774918 +0000 UTC m=+116.955055592" watchObservedRunningTime="2026-04-17 17:27:04.33320537 +0000 UTC m=+116.955486047"
Apr 17 17:27:05.052890 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:05.052848 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8kt47\" (UID: \"61d0dab8-9f71-4bee-b48b-178b647667dd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47"
Apr 17 17:27:05.053092 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:05.052916 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm"
Apr 17 17:27:05.053092 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:05.052946 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm"
Apr 17 17:27:05.053092 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:05.052995 2546 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:27:05.053092 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:05.053059 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle podName:e0ddb199-4f09-4f38-9d09-304ed7807840 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:09.05304556 +0000 UTC m=+121.675326212 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle") pod "router-default-cc85759b6-w8jxm" (UID: "e0ddb199-4f09-4f38-9d09-304ed7807840") : configmap references non-existent config key: service-ca.crt
Apr 17 17:27:05.053092 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:05.053069 2546 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 17:27:05.053295 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:05.053076 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls podName:61d0dab8-9f71-4bee-b48b-178b647667dd nodeName:}" failed. No retries permitted until 2026-04-17 17:27:09.053069913 +0000 UTC m=+121.675350565 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8kt47" (UID: "61d0dab8-9f71-4bee-b48b-178b647667dd") : secret "cluster-monitoring-operator-tls" not found
Apr 17 17:27:05.053295 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:05.053139 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs podName:e0ddb199-4f09-4f38-9d09-304ed7807840 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:09.053121617 +0000 UTC m=+121.675402273 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs") pod "router-default-cc85759b6-w8jxm" (UID: "e0ddb199-4f09-4f38-9d09-304ed7807840") : secret "router-metrics-certs-default" not found
Apr 17 17:27:06.876391 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:06.876368 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-r6rvm_a2e022ed-9ba3-454c-9f27-4b300f2393d6/dns-node-resolver/0.log"
Apr 17 17:27:07.276601 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:07.276532 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2phxj_8b1e9f99-746c-4f28-80b9-ea9eb814cd98/node-ca/0.log"
Apr 17 17:27:07.994540 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:07.994513 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh"]
Apr 17 17:27:07.997338 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:07.997315 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh"
Apr 17 17:27:07.999831 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:07.999807 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 17:27:07.999944 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:07.999924 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 17:27:08.000011 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:07.999996 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-qr9k4\""
Apr 17 17:27:08.000060 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:08.000022 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 17:27:08.000530 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:08.000517 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:27:08.006726 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:08.006708 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh"]
Apr 17 17:27:08.077084 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:08.077054 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95a78b42-cd3a-409d-8ce7-11b8805103c6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-hpbfh\" (UID: \"95a78b42-cd3a-409d-8ce7-11b8805103c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh"
Apr 17 17:27:08.077243 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:08.077152 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95a78b42-cd3a-409d-8ce7-11b8805103c6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-hpbfh\" (UID: \"95a78b42-cd3a-409d-8ce7-11b8805103c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh"
Apr 17 17:27:08.077243 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:08.077170 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdsgh\" (UniqueName: \"kubernetes.io/projected/95a78b42-cd3a-409d-8ce7-11b8805103c6-kube-api-access-bdsgh\") pod \"kube-storage-version-migrator-operator-6769c5d45-hpbfh\" (UID: \"95a78b42-cd3a-409d-8ce7-11b8805103c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh"
Apr 17 17:27:08.178069 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:08.178032 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95a78b42-cd3a-409d-8ce7-11b8805103c6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-hpbfh\" (UID: \"95a78b42-cd3a-409d-8ce7-11b8805103c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh"
Apr 17 17:27:08.178069 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:08.178070 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdsgh\" (UniqueName: \"kubernetes.io/projected/95a78b42-cd3a-409d-8ce7-11b8805103c6-kube-api-access-bdsgh\") pod \"kube-storage-version-migrator-operator-6769c5d45-hpbfh\" (UID: \"95a78b42-cd3a-409d-8ce7-11b8805103c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh"
Apr 17 17:27:08.178299 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:08.178229 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95a78b42-cd3a-409d-8ce7-11b8805103c6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-hpbfh\" (UID: \"95a78b42-cd3a-409d-8ce7-11b8805103c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh"
Apr 17 17:27:08.178540 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:08.178521 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95a78b42-cd3a-409d-8ce7-11b8805103c6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-hpbfh\" (UID: \"95a78b42-cd3a-409d-8ce7-11b8805103c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh"
Apr 17 17:27:08.180295 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:08.180273 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95a78b42-cd3a-409d-8ce7-11b8805103c6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-hpbfh\" (UID: \"95a78b42-cd3a-409d-8ce7-11b8805103c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh"
Apr 17 17:27:08.186208 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:08.186191 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdsgh\" (UniqueName: \"kubernetes.io/projected/95a78b42-cd3a-409d-8ce7-11b8805103c6-kube-api-access-bdsgh\") pod \"kube-storage-version-migrator-operator-6769c5d45-hpbfh\" (UID: \"95a78b42-cd3a-409d-8ce7-11b8805103c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh"
Apr 17 17:27:08.305189 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:08.305163 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh"
Apr 17 17:27:08.414332 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:08.414303 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh"]
Apr 17 17:27:08.417338 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:08.417298 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95a78b42_cd3a_409d_8ce7_11b8805103c6.slice/crio-aa0c2e1a5d70c1cf7cb1afcc8dbaf28172b4d657fa52ce78d015478347644281 WatchSource:0}: Error finding container aa0c2e1a5d70c1cf7cb1afcc8dbaf28172b4d657fa52ce78d015478347644281: Status 404 returned error can't find the container with id aa0c2e1a5d70c1cf7cb1afcc8dbaf28172b4d657fa52ce78d015478347644281
Apr 17 17:27:09.086123 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.086070 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8kt47\" (UID: \"61d0dab8-9f71-4bee-b48b-178b647667dd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47"
Apr 17 17:27:09.086506 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.086167 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm"
Apr 17 17:27:09.086506 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.086217 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm"
Apr 17 17:27:09.086506 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:09.086224 2546 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:27:09.086506 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:09.086284 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls podName:61d0dab8-9f71-4bee-b48b-178b647667dd nodeName:}" failed. No retries permitted until 2026-04-17 17:27:17.08626955 +0000 UTC m=+129.708550203 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8kt47" (UID: "61d0dab8-9f71-4bee-b48b-178b647667dd") : secret "cluster-monitoring-operator-tls" not found
Apr 17 17:27:09.086506 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:09.086325 2546 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 17:27:09.086506 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:09.086341 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle podName:e0ddb199-4f09-4f38-9d09-304ed7807840 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:17.086326037 +0000 UTC m=+129.708606694 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle") pod "router-default-cc85759b6-w8jxm" (UID: "e0ddb199-4f09-4f38-9d09-304ed7807840") : configmap references non-existent config key: service-ca.crt
Apr 17 17:27:09.086506 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:09.086366 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs podName:e0ddb199-4f09-4f38-9d09-304ed7807840 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:17.086354634 +0000 UTC m=+129.708635298 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs") pod "router-default-cc85759b6-w8jxm" (UID: "e0ddb199-4f09-4f38-9d09-304ed7807840") : secret "router-metrics-certs-default" not found
Apr 17 17:27:09.326869 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.326831 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh" event={"ID":"95a78b42-cd3a-409d-8ce7-11b8805103c6","Type":"ContainerStarted","Data":"aa0c2e1a5d70c1cf7cb1afcc8dbaf28172b4d657fa52ce78d015478347644281"}
Apr 17 17:27:09.842236 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.842204 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6d58f4c77b-xxjhl"]
Apr 17 17:27:09.844944 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.844928 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.847462 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.847442 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 17:27:09.847633 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.847603 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 17:27:09.847754 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.847644 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 17:27:09.847871 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.847854 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hlc25\""
Apr 17 17:27:09.853388 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.852984 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 17:27:09.857749 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.857725 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d58f4c77b-xxjhl"]
Apr 17 17:27:09.893205 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.893181 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.893360 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.893254 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-installation-pull-secrets\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.893360 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.893285 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-bound-sa-token\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.893360 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.893303 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-image-registry-private-configuration\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.893360 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.893346 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-certificates\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.893501 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.893366 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9vbk\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-kube-api-access-x9vbk\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.893501 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.893427 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-trusted-ca\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.893501 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.893461 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-ca-trust-extracted\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.994333 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.994275 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-bound-sa-token\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.994499 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.994458 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-image-registry-private-configuration\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.994573 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.994523 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-certificates\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.994573 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.994554 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9vbk\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-kube-api-access-x9vbk\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.994703 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.994652 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-trusted-ca\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.994764 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.994716 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-ca-trust-extracted\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.994764 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.994758 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.994867 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.994833 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-installation-pull-secrets\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.995292 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:09.995068 2546 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:27:09.995292 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:09.995090 2546 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d58f4c77b-xxjhl: secret "image-registry-tls" not found
Apr 17 17:27:09.995292 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:09.995154 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls podName:43b8a62b-9f9f-4274-a62a-7a991a1ebb1d nodeName:}" failed. No retries permitted until 2026-04-17 17:27:10.495132766 +0000 UTC m=+123.117413430 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls") pod "image-registry-6d58f4c77b-xxjhl" (UID: "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d") : secret "image-registry-tls" not found
Apr 17 17:27:09.995292 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.995189 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-certificates\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.995292 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.995243 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-ca-trust-extracted\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.996006 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.995962 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-trusted-ca\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.997258 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.997233 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-image-registry-private-configuration\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:09.997510 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:09.997493 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-installation-pull-secrets\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:10.004668 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:10.004643 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9vbk\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-kube-api-access-x9vbk\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:10.004668 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:10.004654 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-bound-sa-token\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:10.500223 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:10.500199 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:10.500534 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:10.500344 2546 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:27:10.500534 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:10.500361 2546 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d58f4c77b-xxjhl: secret "image-registry-tls" not found
Apr 17 17:27:10.500534 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:10.500412 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls podName:43b8a62b-9f9f-4274-a62a-7a991a1ebb1d nodeName:}" failed. No retries permitted until 2026-04-17 17:27:11.500396734 +0000 UTC m=+124.122677392 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls") pod "image-registry-6d58f4c77b-xxjhl" (UID: "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d") : secret "image-registry-tls" not found
Apr 17 17:27:11.332070 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:11.332029 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh" event={"ID":"95a78b42-cd3a-409d-8ce7-11b8805103c6","Type":"ContainerStarted","Data":"37a978ac484d36fca169a9b922143fe3a1b2b7cb023f25646b49d0c19a21aa2b"}
Apr 17 17:27:11.348877 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:11.348822 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh" podStartSLOduration=2.320458444 podStartE2EDuration="4.348803868s" podCreationTimestamp="2026-04-17 17:27:07 +0000 UTC" firstStartedPulling="2026-04-17 17:27:08.419169659 +0000 UTC m=+121.041450312" lastFinishedPulling="2026-04-17 17:27:10.447515084 +0000 UTC m=+123.069795736" observedRunningTime="2026-04-17 17:27:11.348648991 +0000 UTC m=+123.970929667" watchObservedRunningTime="2026-04-17 17:27:11.348803868 +0000 UTC m=+123.971084544"
Apr 17 17:27:11.507577 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:11.507537 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:11.507995 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:11.507673 2546 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:27:11.507995 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:11.507714 2546 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d58f4c77b-xxjhl: secret "image-registry-tls" not found
Apr 17 17:27:11.507995 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:11.507769 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls podName:43b8a62b-9f9f-4274-a62a-7a991a1ebb1d nodeName:}" failed. No retries permitted until 2026-04-17 17:27:13.507755764 +0000 UTC m=+126.130036418 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls") pod "image-registry-6d58f4c77b-xxjhl" (UID: "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d") : secret "image-registry-tls" not found
Apr 17 17:27:13.524329 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:13.524295 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:13.524741 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:13.524461 2546 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:27:13.524741 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:13.524483 2546 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d58f4c77b-xxjhl: secret "image-registry-tls" not found
Apr 17 17:27:13.524741 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:13.524546 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls podName:43b8a62b-9f9f-4274-a62a-7a991a1ebb1d nodeName:}" failed. No retries permitted until 2026-04-17 17:27:17.524532103 +0000 UTC m=+130.146812756 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls") pod "image-registry-6d58f4c77b-xxjhl" (UID: "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d") : secret "image-registry-tls" not found
Apr 17 17:27:17.157057 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:17.157000 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm"
Apr 17 17:27:17.157057 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:17.157069 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm"
Apr 17 17:27:17.157503 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:17.157095 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8kt47\" (UID: \"61d0dab8-9f71-4bee-b48b-178b647667dd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47"
Apr 17 17:27:17.157503 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:17.157153 2546 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 17:27:17.157503 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:17.157181 2546 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret
"cluster-monitoring-operator-tls" not found Apr 17 17:27:17.157503 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:17.157233 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs podName:e0ddb199-4f09-4f38-9d09-304ed7807840 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:33.157216917 +0000 UTC m=+145.779497574 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs") pod "router-default-cc85759b6-w8jxm" (UID: "e0ddb199-4f09-4f38-9d09-304ed7807840") : secret "router-metrics-certs-default" not found Apr 17 17:27:17.157503 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:17.157249 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls podName:61d0dab8-9f71-4bee-b48b-178b647667dd nodeName:}" failed. No retries permitted until 2026-04-17 17:27:33.157242646 +0000 UTC m=+145.779523298 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8kt47" (UID: "61d0dab8-9f71-4bee-b48b-178b647667dd") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:27:17.157503 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:17.157263 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle podName:e0ddb199-4f09-4f38-9d09-304ed7807840 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:33.157254363 +0000 UTC m=+145.779535016 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle") pod "router-default-cc85759b6-w8jxm" (UID: "e0ddb199-4f09-4f38-9d09-304ed7807840") : configmap references non-existent config key: service-ca.crt Apr 17 17:27:17.560802 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:17.560769 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl" Apr 17 17:27:17.561025 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:17.560914 2546 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:27:17.561025 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:17.560936 2546 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d58f4c77b-xxjhl: secret "image-registry-tls" not found Apr 17 17:27:17.561025 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:17.560998 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls podName:43b8a62b-9f9f-4274-a62a-7a991a1ebb1d nodeName:}" failed. No retries permitted until 2026-04-17 17:27:25.560981436 +0000 UTC m=+138.183262094 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls") pod "image-registry-6d58f4c77b-xxjhl" (UID: "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d") : secret "image-registry-tls" not found Apr 17 17:27:17.762608 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:17.762566 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs\") pod \"network-metrics-daemon-fbmql\" (UID: \"173598bb-6dcc-46e9-a78f-f3d5c1fd4297\") " pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:27:17.762819 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:17.762759 2546 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:27:17.762888 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:17.762841 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs podName:173598bb-6dcc-46e9-a78f-f3d5c1fd4297 nodeName:}" failed. No retries permitted until 2026-04-17 17:29:19.76282065 +0000 UTC m=+252.385101304 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs") pod "network-metrics-daemon-fbmql" (UID: "173598bb-6dcc-46e9-a78f-f3d5c1fd4297") : secret "metrics-daemon-secret" not found Apr 17 17:27:25.627141 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:25.627096 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl" Apr 17 17:27:25.629455 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:25.629433 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls\") pod \"image-registry-6d58f4c77b-xxjhl\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") " pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl" Apr 17 17:27:25.756931 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:25.756900 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl" Apr 17 17:27:25.877615 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:25.877537 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d58f4c77b-xxjhl"] Apr 17 17:27:25.880078 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:25.880038 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43b8a62b_9f9f_4274_a62a_7a991a1ebb1d.slice/crio-333cea9a026a3f81a7d50243931bdb76c1ed834c2324f6bb0036fcbb8b64fe42 WatchSource:0}: Error finding container 333cea9a026a3f81a7d50243931bdb76c1ed834c2324f6bb0036fcbb8b64fe42: Status 404 returned error can't find the container with id 333cea9a026a3f81a7d50243931bdb76c1ed834c2324f6bb0036fcbb8b64fe42 Apr 17 17:27:26.364301 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:26.362241 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl" event={"ID":"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d","Type":"ContainerStarted","Data":"eb8f0268b1830990a0b959b5cbc2eea11baacff6f9afb77fa69dbcee6be880f9"} Apr 17 17:27:26.364301 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:26.362288 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl" event={"ID":"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d","Type":"ContainerStarted","Data":"333cea9a026a3f81a7d50243931bdb76c1ed834c2324f6bb0036fcbb8b64fe42"} Apr 17 17:27:26.364301 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:26.362307 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl" Apr 17 17:27:26.386321 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:26.386271 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl" 
podStartSLOduration=17.386258024 podStartE2EDuration="17.386258024s" podCreationTimestamp="2026-04-17 17:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:27:26.385865437 +0000 UTC m=+139.008146102" watchObservedRunningTime="2026-04-17 17:27:26.386258024 +0000 UTC m=+139.008538698" Apr 17 17:27:33.191592 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:33.191551 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:33.191592 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:33.191601 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:33.192212 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:33.191628 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8kt47\" (UID: \"61d0dab8-9f71-4bee-b48b-178b647667dd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" Apr 17 17:27:33.192270 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:33.192250 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ddb199-4f09-4f38-9d09-304ed7807840-service-ca-bundle\") pod 
\"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:33.193946 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:33.193922 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/61d0dab8-9f71-4bee-b48b-178b647667dd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8kt47\" (UID: \"61d0dab8-9f71-4bee-b48b-178b647667dd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" Apr 17 17:27:33.194075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:33.194052 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0ddb199-4f09-4f38-9d09-304ed7807840-metrics-certs\") pod \"router-default-cc85759b6-w8jxm\" (UID: \"e0ddb199-4f09-4f38-9d09-304ed7807840\") " pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:33.416198 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:33.416160 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-wh4kv\"" Apr 17 17:27:33.420870 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:33.420855 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kvtqb\"" Apr 17 17:27:33.424590 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:33.424577 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" Apr 17 17:27:33.429272 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:33.429241 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:33.550159 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:33.550123 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47"] Apr 17 17:27:33.553090 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:33.553063 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61d0dab8_9f71_4bee_b48b_178b647667dd.slice/crio-878ce219906db8234e92576aecd414afa6196e874b9bac74665de1c419ea44bf WatchSource:0}: Error finding container 878ce219906db8234e92576aecd414afa6196e874b9bac74665de1c419ea44bf: Status 404 returned error can't find the container with id 878ce219906db8234e92576aecd414afa6196e874b9bac74665de1c419ea44bf Apr 17 17:27:33.569279 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:33.569258 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-cc85759b6-w8jxm"] Apr 17 17:27:33.572150 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:33.572120 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ddb199_4f09_4f38_9d09_304ed7807840.slice/crio-3de95d11746b6fe2ed5b678a3a6e837d9b2baf93fef9d0055cceb914eff4e871 WatchSource:0}: Error finding container 3de95d11746b6fe2ed5b678a3a6e837d9b2baf93fef9d0055cceb914eff4e871: Status 404 returned error can't find the container with id 3de95d11746b6fe2ed5b678a3a6e837d9b2baf93fef9d0055cceb914eff4e871 Apr 17 17:27:34.378750 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:34.378710 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-cc85759b6-w8jxm" event={"ID":"e0ddb199-4f09-4f38-9d09-304ed7807840","Type":"ContainerStarted","Data":"ab37b1b80cbec4b8d802a7fef4786e6c842f27a85411d4145be1acbf83eaddf9"} Apr 17 17:27:34.379215 ip-10-0-137-46 kubenswrapper[2546]: 
I0417 17:27:34.378758 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-cc85759b6-w8jxm" event={"ID":"e0ddb199-4f09-4f38-9d09-304ed7807840","Type":"ContainerStarted","Data":"3de95d11746b6fe2ed5b678a3a6e837d9b2baf93fef9d0055cceb914eff4e871"} Apr 17 17:27:34.380008 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:34.379982 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" event={"ID":"61d0dab8-9f71-4bee-b48b-178b647667dd","Type":"ContainerStarted","Data":"878ce219906db8234e92576aecd414afa6196e874b9bac74665de1c419ea44bf"} Apr 17 17:27:34.399278 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:34.399231 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-cc85759b6-w8jxm" podStartSLOduration=33.399217789 podStartE2EDuration="33.399217789s" podCreationTimestamp="2026-04-17 17:27:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:27:34.398801498 +0000 UTC m=+147.021082174" watchObservedRunningTime="2026-04-17 17:27:34.399217789 +0000 UTC m=+147.021498463" Apr 17 17:27:34.430326 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:34.430298 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:34.433291 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:34.433267 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:35.389710 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:35.389608 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" 
event={"ID":"61d0dab8-9f71-4bee-b48b-178b647667dd","Type":"ContainerStarted","Data":"ecd2f9aae8be6307b0ae167b0b541f4197ede11ec2f92e1d34692b5ceeda162b"} Apr 17 17:27:35.390120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:35.389943 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:35.391139 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:35.391120 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-cc85759b6-w8jxm" Apr 17 17:27:35.405588 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:35.405553 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8kt47" podStartSLOduration=32.848839288 podStartE2EDuration="34.405542266s" podCreationTimestamp="2026-04-17 17:27:01 +0000 UTC" firstStartedPulling="2026-04-17 17:27:33.554821044 +0000 UTC m=+146.177101697" lastFinishedPulling="2026-04-17 17:27:35.111524021 +0000 UTC m=+147.733804675" observedRunningTime="2026-04-17 17:27:35.404209336 +0000 UTC m=+148.026490010" watchObservedRunningTime="2026-04-17 17:27:35.405542266 +0000 UTC m=+148.027822937" Apr 17 17:27:36.711648 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.711618 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d58f4c77b-xxjhl"] Apr 17 17:27:36.712693 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.712651 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-m2mcc"] Apr 17 17:27:36.715746 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.715725 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-m2mcc" Apr 17 17:27:36.718083 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.718062 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-68nj2\"" Apr 17 17:27:36.718194 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.718093 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 17:27:36.718428 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.718414 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 17:27:36.725065 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.725044 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-m2mcc"] Apr 17 17:27:36.740868 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.740842 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"] Apr 17 17:27:36.743825 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.743792 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf" Apr 17 17:27:36.757298 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.757275 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"] Apr 17 17:27:36.819021 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.818984 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b7243405-fea2-48ef-80da-809f729864d2-data-volume\") pod \"insights-runtime-extractor-m2mcc\" (UID: \"b7243405-fea2-48ef-80da-809f729864d2\") " pod="openshift-insights/insights-runtime-extractor-m2mcc" Apr 17 17:27:36.819021 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.819022 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b7243405-fea2-48ef-80da-809f729864d2-crio-socket\") pod \"insights-runtime-extractor-m2mcc\" (UID: \"b7243405-fea2-48ef-80da-809f729864d2\") " pod="openshift-insights/insights-runtime-extractor-m2mcc" Apr 17 17:27:36.819234 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.819042 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b7243405-fea2-48ef-80da-809f729864d2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m2mcc\" (UID: \"b7243405-fea2-48ef-80da-809f729864d2\") " pod="openshift-insights/insights-runtime-extractor-m2mcc" Apr 17 17:27:36.819234 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.819111 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhhjd\" (UniqueName: \"kubernetes.io/projected/b7243405-fea2-48ef-80da-809f729864d2-kube-api-access-lhhjd\") pod \"insights-runtime-extractor-m2mcc\" (UID: 
\"b7243405-fea2-48ef-80da-809f729864d2\") " pod="openshift-insights/insights-runtime-extractor-m2mcc" Apr 17 17:27:36.819234 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.819149 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b7243405-fea2-48ef-80da-809f729864d2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m2mcc\" (UID: \"b7243405-fea2-48ef-80da-809f729864d2\") " pod="openshift-insights/insights-runtime-extractor-m2mcc" Apr 17 17:27:36.920402 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.920370 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-registry-tls\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf" Apr 17 17:27:36.920611 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.920421 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b7243405-fea2-48ef-80da-809f729864d2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m2mcc\" (UID: \"b7243405-fea2-48ef-80da-809f729864d2\") " pod="openshift-insights/insights-runtime-extractor-m2mcc" Apr 17 17:27:36.920611 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.920461 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-trusted-ca\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf" Apr 17 17:27:36.920611 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.920480 2546 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-ca-trust-extracted\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf" Apr 17 17:27:36.920611 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.920601 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-bound-sa-token\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf" Apr 17 17:27:36.920863 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.920653 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-registry-certificates\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf" Apr 17 17:27:36.920863 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.920712 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5djkh\" (UniqueName: \"kubernetes.io/projected/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-kube-api-access-5djkh\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf" Apr 17 17:27:36.920863 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.920755 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/b7243405-fea2-48ef-80da-809f729864d2-data-volume\") pod \"insights-runtime-extractor-m2mcc\" (UID: \"b7243405-fea2-48ef-80da-809f729864d2\") " pod="openshift-insights/insights-runtime-extractor-m2mcc"
Apr 17 17:27:36.920863 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.920774 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b7243405-fea2-48ef-80da-809f729864d2-crio-socket\") pod \"insights-runtime-extractor-m2mcc\" (UID: \"b7243405-fea2-48ef-80da-809f729864d2\") " pod="openshift-insights/insights-runtime-extractor-m2mcc"
Apr 17 17:27:36.920863 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.920791 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b7243405-fea2-48ef-80da-809f729864d2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m2mcc\" (UID: \"b7243405-fea2-48ef-80da-809f729864d2\") " pod="openshift-insights/insights-runtime-extractor-m2mcc"
Apr 17 17:27:36.920863 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.920810 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-image-registry-private-configuration\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:36.920863 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.920827 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-installation-pull-secrets\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:36.921120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.920915 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b7243405-fea2-48ef-80da-809f729864d2-crio-socket\") pod \"insights-runtime-extractor-m2mcc\" (UID: \"b7243405-fea2-48ef-80da-809f729864d2\") " pod="openshift-insights/insights-runtime-extractor-m2mcc"
Apr 17 17:27:36.921120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.920971 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhhjd\" (UniqueName: \"kubernetes.io/projected/b7243405-fea2-48ef-80da-809f729864d2-kube-api-access-lhhjd\") pod \"insights-runtime-extractor-m2mcc\" (UID: \"b7243405-fea2-48ef-80da-809f729864d2\") " pod="openshift-insights/insights-runtime-extractor-m2mcc"
Apr 17 17:27:36.921120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.921019 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b7243405-fea2-48ef-80da-809f729864d2-data-volume\") pod \"insights-runtime-extractor-m2mcc\" (UID: \"b7243405-fea2-48ef-80da-809f729864d2\") " pod="openshift-insights/insights-runtime-extractor-m2mcc"
Apr 17 17:27:36.921301 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.921283 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b7243405-fea2-48ef-80da-809f729864d2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m2mcc\" (UID: \"b7243405-fea2-48ef-80da-809f729864d2\") " pod="openshift-insights/insights-runtime-extractor-m2mcc"
Apr 17 17:27:36.922887 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.922867 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b7243405-fea2-48ef-80da-809f729864d2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m2mcc\" (UID: \"b7243405-fea2-48ef-80da-809f729864d2\") " pod="openshift-insights/insights-runtime-extractor-m2mcc"
Apr 17 17:27:36.950215 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:36.950158 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhhjd\" (UniqueName: \"kubernetes.io/projected/b7243405-fea2-48ef-80da-809f729864d2-kube-api-access-lhhjd\") pod \"insights-runtime-extractor-m2mcc\" (UID: \"b7243405-fea2-48ef-80da-809f729864d2\") " pod="openshift-insights/insights-runtime-extractor-m2mcc"
Apr 17 17:27:37.021210 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.021179 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-ca-trust-extracted\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.021210 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.021209 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-bound-sa-token\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.021412 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.021241 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-registry-certificates\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.021412 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.021261 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5djkh\" (UniqueName: \"kubernetes.io/projected/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-kube-api-access-5djkh\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.021412 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.021286 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-image-registry-private-configuration\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.021412 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.021312 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-installation-pull-secrets\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.021412 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.021386 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-registry-tls\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.021654 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.021457 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-trusted-ca\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.021654 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.021604 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-ca-trust-extracted\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.022125 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.022101 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-registry-certificates\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.022233 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.022204 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-trusted-ca\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.023663 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.023635 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-installation-pull-secrets\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.023790 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.023716 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-image-registry-private-configuration\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.023917 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.023901 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-registry-tls\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.023997 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.023987 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-m2mcc"
Apr 17 17:27:37.030657 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.030229 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-bound-sa-token\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.030657 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.030601 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5djkh\" (UniqueName: \"kubernetes.io/projected/bf3f4e12-9e99-452e-8ff8-dd441d2d2b39-kube-api-access-5djkh\") pod \"image-registry-7cd7cd7b76-mbkvf\" (UID: \"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39\") " pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.053154 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.053121 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.155525 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.155496 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-m2mcc"]
Apr 17 17:27:37.158742 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:37.158716 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7243405_fea2_48ef_80da_809f729864d2.slice/crio-2effe7e0edf1f88947e8249917ba0a0e45fd482d5b66cb8b7a702ef1989f5850 WatchSource:0}: Error finding container 2effe7e0edf1f88947e8249917ba0a0e45fd482d5b66cb8b7a702ef1989f5850: Status 404 returned error can't find the container with id 2effe7e0edf1f88947e8249917ba0a0e45fd482d5b66cb8b7a702ef1989f5850
Apr 17 17:27:37.193068 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.193038 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"]
Apr 17 17:27:37.196138 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:37.196107 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf3f4e12_9e99_452e_8ff8_dd441d2d2b39.slice/crio-9eb8e1419507b145950e225b6d5dd1cf5f7ee8a09c447072a6816d9f4170a3d7 WatchSource:0}: Error finding container 9eb8e1419507b145950e225b6d5dd1cf5f7ee8a09c447072a6816d9f4170a3d7: Status 404 returned error can't find the container with id 9eb8e1419507b145950e225b6d5dd1cf5f7ee8a09c447072a6816d9f4170a3d7
Apr 17 17:27:37.397512 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.397460 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf" event={"ID":"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39","Type":"ContainerStarted","Data":"2599ff06991a582a28147a9eb145f9ad970e30024249ddfedd2b9474da6ee571"}
Apr 17 17:27:37.397512 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.397518 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf" event={"ID":"bf3f4e12-9e99-452e-8ff8-dd441d2d2b39","Type":"ContainerStarted","Data":"9eb8e1419507b145950e225b6d5dd1cf5f7ee8a09c447072a6816d9f4170a3d7"}
Apr 17 17:27:37.397786 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.397595 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:37.402949 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.402913 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m2mcc" event={"ID":"b7243405-fea2-48ef-80da-809f729864d2","Type":"ContainerStarted","Data":"30c1cc2452fb316adc9ae69cbb58f52502594b549c9f02053e88d2d2027481c3"}
Apr 17 17:27:37.403077 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.402956 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m2mcc" event={"ID":"b7243405-fea2-48ef-80da-809f729864d2","Type":"ContainerStarted","Data":"2effe7e0edf1f88947e8249917ba0a0e45fd482d5b66cb8b7a702ef1989f5850"}
Apr 17 17:27:37.428651 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:37.428603 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf" podStartSLOduration=1.428590142 podStartE2EDuration="1.428590142s" podCreationTimestamp="2026-04-17 17:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:27:37.427735231 +0000 UTC m=+150.050015906" watchObservedRunningTime="2026-04-17 17:27:37.428590142 +0000 UTC m=+150.050870817"
Apr 17 17:27:38.408226 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:38.408194 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m2mcc" event={"ID":"b7243405-fea2-48ef-80da-809f729864d2","Type":"ContainerStarted","Data":"5dd933093dbd2b6e687af0a7ab6a44e3edd3a994b0aa14b68ba61576e4e2c0be"}
Apr 17 17:27:40.415416 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:40.415379 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m2mcc" event={"ID":"b7243405-fea2-48ef-80da-809f729864d2","Type":"ContainerStarted","Data":"4a5b855bfa1bd664a9a3cf03147775245d5b61c17aec197d90ca946c70e2d484"}
Apr 17 17:27:40.434294 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:40.434169 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-m2mcc" podStartSLOduration=2.248429567 podStartE2EDuration="4.434150896s" podCreationTimestamp="2026-04-17 17:27:36 +0000 UTC" firstStartedPulling="2026-04-17 17:27:37.214322578 +0000 UTC m=+149.836603232" lastFinishedPulling="2026-04-17 17:27:39.400043906 +0000 UTC m=+152.022324561" observedRunningTime="2026-04-17 17:27:40.43376811 +0000 UTC m=+153.056048784" watchObservedRunningTime="2026-04-17 17:27:40.434150896 +0000 UTC m=+153.056431574"
Apr 17 17:27:43.302208 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:43.302163 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-sm5m5" podUID="d82d71b1-2458-4671-b28c-5e3870cd761a"
Apr 17 17:27:43.313390 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:43.313343 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-kgnl9" podUID="e4397ebe-1923-4566-89e4-f777e71713b1"
Apr 17 17:27:43.421327 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:43.421296 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sm5m5"
Apr 17 17:27:44.959277 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:44.959238 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-fbmql" podUID="173598bb-6dcc-46e9-a78f-f3d5c1fd4297"
Apr 17 17:27:45.032134 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.032102 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-687v8"]
Apr 17 17:27:45.034813 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.034796 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8"
Apr 17 17:27:45.037163 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.037140 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 17:27:45.037392 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.037377 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 17:27:45.037914 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.037899 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-jnngb\""
Apr 17 17:27:45.038009 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.037903 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 17:27:45.051865 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.051845 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-687v8"]
Apr 17 17:27:45.052753 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.052734 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7qddr"]
Apr 17 17:27:45.054584 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.054570 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.056853 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.056834 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 17:27:45.058888 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.058857 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-fll6m\""
Apr 17 17:27:45.059039 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.059024 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 17:27:45.059117 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.059076 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 17:27:45.067232 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.067212 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7qddr"]
Apr 17 17:27:45.079801 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.079777 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-bcdtv"]
Apr 17 17:27:45.081668 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.081648 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bcdtv"
Apr 17 17:27:45.083807 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.083788 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 17:27:45.083903 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.083820 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-4qm8w\""
Apr 17 17:27:45.083903 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.083821 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 17:27:45.084002 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.083923 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 17:27:45.087344 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.087322 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fa27954-0b1c-4018-86c7-30ed361a6229-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-687v8\" (UID: \"9fa27954-0b1c-4018-86c7-30ed361a6229\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8"
Apr 17 17:27:45.087445 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.087353 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.087445 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.087397 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9fa27954-0b1c-4018-86c7-30ed361a6229-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-687v8\" (UID: \"9fa27954-0b1c-4018-86c7-30ed361a6229\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8"
Apr 17 17:27:45.087445 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.087425 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.087614 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.087451 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nffh\" (UniqueName: \"kubernetes.io/projected/9fa27954-0b1c-4018-86c7-30ed361a6229-kube-api-access-4nffh\") pod \"openshift-state-metrics-9d44df66c-687v8\" (UID: \"9fa27954-0b1c-4018-86c7-30ed361a6229\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8"
Apr 17 17:27:45.087614 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.087533 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.087614 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.087574 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.087614 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.087608 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9fa27954-0b1c-4018-86c7-30ed361a6229-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-687v8\" (UID: \"9fa27954-0b1c-4018-86c7-30ed361a6229\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8"
Apr 17 17:27:45.087928 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.087635 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7tmm\" (UniqueName: \"kubernetes.io/projected/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-kube-api-access-q7tmm\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.087928 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.087734 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.188225 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188181 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fa27954-0b1c-4018-86c7-30ed361a6229-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-687v8\" (UID: \"9fa27954-0b1c-4018-86c7-30ed361a6229\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8"
Apr 17 17:27:45.188225 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188227 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-tls\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv"
Apr 17 17:27:45.188424 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188246 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2e480281-dc92-4d6c-99c6-1d7dbd41136d-metrics-client-ca\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv"
Apr 17 17:27:45.188424 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188270 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.188424 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188338 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2e480281-dc92-4d6c-99c6-1d7dbd41136d-root\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv"
Apr 17 17:27:45.188424 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:45.188346 2546 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 17 17:27:45.188424 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188374 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9fa27954-0b1c-4018-86c7-30ed361a6229-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-687v8\" (UID: \"9fa27954-0b1c-4018-86c7-30ed361a6229\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8"
Apr 17 17:27:45.188424 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188408 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-textfile\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv"
Apr 17 17:27:45.188424 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:45.188417 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fa27954-0b1c-4018-86c7-30ed361a6229-openshift-state-metrics-tls podName:9fa27954-0b1c-4018-86c7-30ed361a6229 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:45.6884008 +0000 UTC m=+158.310681453 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/9fa27954-0b1c-4018-86c7-30ed361a6229-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-687v8" (UID: "9fa27954-0b1c-4018-86c7-30ed361a6229") : secret "openshift-state-metrics-tls" not found
Apr 17 17:27:45.188751 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188478 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.188751 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188508 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-accelerators-collector-config\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv"
Apr 17 17:27:45.188751 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188542 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2e480281-dc92-4d6c-99c6-1d7dbd41136d-sys\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv"
Apr 17 17:27:45.188751 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188579 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nffh\" (UniqueName: \"kubernetes.io/projected/9fa27954-0b1c-4018-86c7-30ed361a6229-kube-api-access-4nffh\") pod \"openshift-state-metrics-9d44df66c-687v8\" (UID: \"9fa27954-0b1c-4018-86c7-30ed361a6229\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8"
Apr 17 17:27:45.188751 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188618 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mckj\" (UniqueName: \"kubernetes.io/projected/2e480281-dc92-4d6c-99c6-1d7dbd41136d-kube-api-access-4mckj\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv"
Apr 17 17:27:45.188751 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188658 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.189055 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188799 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.189055 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188839 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9fa27954-0b1c-4018-86c7-30ed361a6229-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-687v8\" (UID: \"9fa27954-0b1c-4018-86c7-30ed361a6229\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8"
Apr 17 17:27:45.189055 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188866 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7tmm\" (UniqueName: \"kubernetes.io/projected/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-kube-api-access-q7tmm\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.189055 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188935 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.189055 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188975 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.189055 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.188984 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-wtmp\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv"
Apr 17 17:27:45.189055 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.189012 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv"
Apr 17 17:27:45.189324 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:45.189069 2546 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 17 17:27:45.189324 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:45.189163 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-kube-state-metrics-tls podName:f62beaf3-38e8-43b7-8bda-61534c3eb9a3 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:45.689146134 +0000 UTC m=+158.311426788 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-7qddr" (UID: "f62beaf3-38e8-43b7-8bda-61534c3eb9a3") : secret "kube-state-metrics-tls" not found
Apr 17 17:27:45.189324 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.189286 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.189619 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.189597 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.189665 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.189644 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9fa27954-0b1c-4018-86c7-30ed361a6229-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-687v8\" (UID: \"9fa27954-0b1c-4018-86c7-30ed361a6229\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8"
Apr 17 17:27:45.190802 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.190785 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr"
Apr 17 17:27:45.190932 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.190915 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9fa27954-0b1c-4018-86c7-30ed361a6229-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-687v8\" (UID: \"9fa27954-0b1c-4018-86c7-30ed361a6229\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8"
Apr 17 17:27:45.197082 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.197063 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nffh\" (UniqueName: \"kubernetes.io/projected/9fa27954-0b1c-4018-86c7-30ed361a6229-kube-api-access-4nffh\") pod \"openshift-state-metrics-9d44df66c-687v8\" (UID: \"9fa27954-0b1c-4018-86c7-30ed361a6229\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8"
Apr 17 17:27:45.197479 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.197457
2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7tmm\" (UniqueName: \"kubernetes.io/projected/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-kube-api-access-q7tmm\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr" Apr 17 17:27:45.290129 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.290104 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-wtmp\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.290304 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.290138 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.290304 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.290171 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-tls\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.290304 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.290188 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2e480281-dc92-4d6c-99c6-1d7dbd41136d-metrics-client-ca\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " 
pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.290304 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.290255 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2e480281-dc92-4d6c-99c6-1d7dbd41136d-root\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.290521 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.290304 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-textfile\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.290521 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.290334 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2e480281-dc92-4d6c-99c6-1d7dbd41136d-root\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.290521 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.290341 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-accelerators-collector-config\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.290521 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.290303 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-wtmp\") pod \"node-exporter-bcdtv\" (UID: 
\"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.290521 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:45.290403 2546 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:27:45.290521 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.290415 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2e480281-dc92-4d6c-99c6-1d7dbd41136d-sys\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.290521 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.290478 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mckj\" (UniqueName: \"kubernetes.io/projected/2e480281-dc92-4d6c-99c6-1d7dbd41136d-kube-api-access-4mckj\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.290521 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.290481 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2e480281-dc92-4d6c-99c6-1d7dbd41136d-sys\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.290521 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:45.290494 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-tls podName:2e480281-dc92-4d6c-99c6-1d7dbd41136d nodeName:}" failed. No retries permitted until 2026-04-17 17:27:45.79047217 +0000 UTC m=+158.412752823 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-tls") pod "node-exporter-bcdtv" (UID: "2e480281-dc92-4d6c-99c6-1d7dbd41136d") : secret "node-exporter-tls" not found Apr 17 17:27:45.290916 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.290632 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-textfile\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.290916 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.290762 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2e480281-dc92-4d6c-99c6-1d7dbd41136d-metrics-client-ca\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.290916 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.290864 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-accelerators-collector-config\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.292483 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.292459 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.299969 ip-10-0-137-46 
kubenswrapper[2546]: I0417 17:27:45.299942 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mckj\" (UniqueName: \"kubernetes.io/projected/2e480281-dc92-4d6c-99c6-1d7dbd41136d-kube-api-access-4mckj\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.694273 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.694184 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr" Apr 17 17:27:45.694273 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.694257 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fa27954-0b1c-4018-86c7-30ed361a6229-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-687v8\" (UID: \"9fa27954-0b1c-4018-86c7-30ed361a6229\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8" Apr 17 17:27:45.696590 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.696561 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f62beaf3-38e8-43b7-8bda-61534c3eb9a3-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7qddr\" (UID: \"f62beaf3-38e8-43b7-8bda-61534c3eb9a3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr" Apr 17 17:27:45.696590 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.696578 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/9fa27954-0b1c-4018-86c7-30ed361a6229-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-687v8\" (UID: \"9fa27954-0b1c-4018-86c7-30ed361a6229\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8" Apr 17 17:27:45.795644 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.795604 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-tls\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:45.795811 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:45.795770 2546 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:27:45.795854 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:45.795832 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-tls podName:2e480281-dc92-4d6c-99c6-1d7dbd41136d nodeName:}" failed. No retries permitted until 2026-04-17 17:27:46.795817043 +0000 UTC m=+159.418097702 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-tls") pod "node-exporter-bcdtv" (UID: "2e480281-dc92-4d6c-99c6-1d7dbd41136d") : secret "node-exporter-tls" not found Apr 17 17:27:45.944388 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.944308 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8" Apr 17 17:27:45.963654 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:45.963630 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr" Apr 17 17:27:46.099927 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.099895 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-687v8"] Apr 17 17:27:46.102496 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:46.102467 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fa27954_0b1c_4018_86c7_30ed361a6229.slice/crio-0cac75fbb4687192d21a06c366e4974011b5817db9900fbde27c50cc99ce7e65 WatchSource:0}: Error finding container 0cac75fbb4687192d21a06c366e4974011b5817db9900fbde27c50cc99ce7e65: Status 404 returned error can't find the container with id 0cac75fbb4687192d21a06c366e4974011b5817db9900fbde27c50cc99ce7e65 Apr 17 17:27:46.125419 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.125353 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7qddr"] Apr 17 17:27:46.137417 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.137391 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:27:46.138755 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:46.138731 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf62beaf3_38e8_43b7_8bda_61534c3eb9a3.slice/crio-fdcc76eb9c168410fdb12cdeed99b88cb3c7a69da6ba1c224962e17141ca8016 WatchSource:0}: Error finding container fdcc76eb9c168410fdb12cdeed99b88cb3c7a69da6ba1c224962e17141ca8016: Status 404 returned error can't find the container with id fdcc76eb9c168410fdb12cdeed99b88cb3c7a69da6ba1c224962e17141ca8016 Apr 17 17:27:46.141007 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.140990 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.143173 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.143148 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 17:27:46.143312 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.143294 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 17:27:46.143433 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.143409 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 17:27:46.143569 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.143532 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 17:27:46.143569 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.143542 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-fjjzs\"" Apr 17 17:27:46.143724 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.143597 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 17:27:46.143724 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.143631 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 17:27:46.143724 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.143715 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 17:27:46.143984 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.143967 2546 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 17:27:46.144484 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.144466 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 17:27:46.154242 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.154221 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:27:46.198666 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.198635 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d11f87d0-3474-4c6c-a52a-f414b58a875b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.198786 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.198709 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzg9t\" (UniqueName: \"kubernetes.io/projected/d11f87d0-3474-4c6c-a52a-f414b58a875b-kube-api-access-rzg9t\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.198786 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.198746 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.198912 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.198811 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/d11f87d0-3474-4c6c-a52a-f414b58a875b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.198912 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.198883 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.199005 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.198930 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d11f87d0-3474-4c6c-a52a-f414b58a875b-config-out\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.199005 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.198956 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.199096 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.199005 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-config-volume\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.199096 
ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.199028 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-web-config\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.199096 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.199091 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.199224 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.199122 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d11f87d0-3474-4c6c-a52a-f414b58a875b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.199275 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.199224 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.199275 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.199260 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/d11f87d0-3474-4c6c-a52a-f414b58a875b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.300052 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.300020 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.300052 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.300056 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d11f87d0-3474-4c6c-a52a-f414b58a875b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.300293 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.300074 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.300293 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.300089 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d11f87d0-3474-4c6c-a52a-f414b58a875b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.300293 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.300232 2546 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d11f87d0-3474-4c6c-a52a-f414b58a875b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.300293 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.300262 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzg9t\" (UniqueName: \"kubernetes.io/projected/d11f87d0-3474-4c6c-a52a-f414b58a875b-kube-api-access-rzg9t\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.300293 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.300288 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.300543 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.300449 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d11f87d0-3474-4c6c-a52a-f414b58a875b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.300543 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.300500 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.300610 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.300545 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d11f87d0-3474-4c6c-a52a-f414b58a875b-config-out\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.300610 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.300574 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.300610 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.300603 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-config-volume\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.300748 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.300630 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-web-config\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.302910 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.302744 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: 
\"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.302910 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.302779 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.302910 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.302837 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d11f87d0-3474-4c6c-a52a-f414b58a875b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.302910 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.302878 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-web-config\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.303191 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:46.302944 2546 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 17 17:27:46.303191 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:27:46.302999 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-main-tls podName:d11f87d0-3474-4c6c-a52a-f414b58a875b nodeName:}" failed. No retries permitted until 2026-04-17 17:27:46.802980941 +0000 UTC m=+159.425261598 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "d11f87d0-3474-4c6c-a52a-f414b58a875b") : secret "alertmanager-main-tls" not found Apr 17 17:27:46.303315 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.303254 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d11f87d0-3474-4c6c-a52a-f414b58a875b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.303714 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.303628 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.303817 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.303784 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d11f87d0-3474-4c6c-a52a-f414b58a875b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.304582 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.304563 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d11f87d0-3474-4c6c-a52a-f414b58a875b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.305590 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.305572 2546 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d11f87d0-3474-4c6c-a52a-f414b58a875b-config-out\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.305844 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.305824 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.306263 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.306243 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-config-volume\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.313166 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.313145 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzg9t\" (UniqueName: \"kubernetes.io/projected/d11f87d0-3474-4c6c-a52a-f414b58a875b-kube-api-access-rzg9t\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.429695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.429633 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr" event={"ID":"f62beaf3-38e8-43b7-8bda-61534c3eb9a3","Type":"ContainerStarted","Data":"fdcc76eb9c168410fdb12cdeed99b88cb3c7a69da6ba1c224962e17141ca8016"} Apr 17 17:27:46.431262 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.431237 2546 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8" event={"ID":"9fa27954-0b1c-4018-86c7-30ed361a6229","Type":"ContainerStarted","Data":"3b2a9a0287020c69463f8afe915ae2e7a49c1db494d59d276eff5f1b1a5dceb7"} Apr 17 17:27:46.431382 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.431269 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8" event={"ID":"9fa27954-0b1c-4018-86c7-30ed361a6229","Type":"ContainerStarted","Data":"cc706abed8d3fbc955fbc092975166b4e8a9938e6275cc7d5496d4ac645ef2dd"} Apr 17 17:27:46.431382 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.431284 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8" event={"ID":"9fa27954-0b1c-4018-86c7-30ed361a6229","Type":"ContainerStarted","Data":"0cac75fbb4687192d21a06c366e4974011b5817db9900fbde27c50cc99ce7e65"} Apr 17 17:27:46.716898 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.716862 2546 patch_prober.go:28] interesting pod/image-registry-6d58f4c77b-xxjhl container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 17:27:46.717065 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.716917 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl" podUID="43b8a62b-9f9f-4274-a62a-7a991a1ebb1d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:27:46.806814 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.806775 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.807001 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.806886 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-tls\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:46.810315 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.810284 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2e480281-dc92-4d6c-99c6-1d7dbd41136d-node-exporter-tls\") pod \"node-exporter-bcdtv\" (UID: \"2e480281-dc92-4d6c-99c6-1d7dbd41136d\") " pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:46.810946 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.810925 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:46.889988 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:46.889950 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-bcdtv" Apr 17 17:27:46.900578 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:46.900530 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e480281_dc92_4d6c_99c6_1d7dbd41136d.slice/crio-14cd3d6c283553eb05afc25a349483cd0b3828d172875a6d1c394cac98f4b7f6 WatchSource:0}: Error finding container 14cd3d6c283553eb05afc25a349483cd0b3828d172875a6d1c394cac98f4b7f6: Status 404 returned error can't find the container with id 14cd3d6c283553eb05afc25a349483cd0b3828d172875a6d1c394cac98f4b7f6 Apr 17 17:27:47.062145 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:47.062112 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:27:47.439768 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:47.439662 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bcdtv" event={"ID":"2e480281-dc92-4d6c-99c6-1d7dbd41136d","Type":"ContainerStarted","Data":"14cd3d6c283553eb05afc25a349483cd0b3828d172875a6d1c394cac98f4b7f6"} Apr 17 17:27:47.655839 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:47.655777 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:27:47.665560 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:47.665523 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd11f87d0_3474_4c6c_a52a_f414b58a875b.slice/crio-c83070f11beed6f7f251e1020d41010db0be307aa0a402b0743171ae2d3c0143 WatchSource:0}: Error finding container c83070f11beed6f7f251e1020d41010db0be307aa0a402b0743171ae2d3c0143: Status 404 returned error can't find the container with id c83070f11beed6f7f251e1020d41010db0be307aa0a402b0743171ae2d3c0143 Apr 17 17:27:48.119983 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.119956 2546 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:27:48.122886 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.122861 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d82d71b1-2458-4671-b28c-5e3870cd761a-metrics-tls\") pod \"dns-default-sm5m5\" (UID: \"d82d71b1-2458-4671-b28c-5e3870cd761a\") " pod="openshift-dns/dns-default-sm5m5" Apr 17 17:27:48.221072 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.221031 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert\") pod \"ingress-canary-kgnl9\" (UID: \"e4397ebe-1923-4566-89e4-f777e71713b1\") " pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:27:48.224065 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.224003 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4397ebe-1923-4566-89e4-f777e71713b1-cert\") pod \"ingress-canary-kgnl9\" (UID: \"e4397ebe-1923-4566-89e4-f777e71713b1\") " pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:27:48.224203 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.224107 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w4qb9\"" Apr 17 17:27:48.232769 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.232722 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-sm5m5" Apr 17 17:27:48.373592 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.373551 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sm5m5"] Apr 17 17:27:48.377047 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:48.377011 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd82d71b1_2458_4671_b28c_5e3870cd761a.slice/crio-30f4d4d9866ae41a28e9d310d56d76e75966852d0c216bfe454535053b4b05ed WatchSource:0}: Error finding container 30f4d4d9866ae41a28e9d310d56d76e75966852d0c216bfe454535053b4b05ed: Status 404 returned error can't find the container with id 30f4d4d9866ae41a28e9d310d56d76e75966852d0c216bfe454535053b4b05ed Apr 17 17:27:48.443731 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.443698 2546 generic.go:358] "Generic (PLEG): container finished" podID="2e480281-dc92-4d6c-99c6-1d7dbd41136d" containerID="0396f58fe9f48725d48c2fd007976cdec5646e015159277a19a673a788e6de1c" exitCode=0 Apr 17 17:27:48.443919 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.443771 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bcdtv" event={"ID":"2e480281-dc92-4d6c-99c6-1d7dbd41136d","Type":"ContainerDied","Data":"0396f58fe9f48725d48c2fd007976cdec5646e015159277a19a673a788e6de1c"} Apr 17 17:27:48.445036 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.445002 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sm5m5" event={"ID":"d82d71b1-2458-4671-b28c-5e3870cd761a","Type":"ContainerStarted","Data":"30f4d4d9866ae41a28e9d310d56d76e75966852d0c216bfe454535053b4b05ed"} Apr 17 17:27:48.446882 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.446854 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr" 
event={"ID":"f62beaf3-38e8-43b7-8bda-61534c3eb9a3","Type":"ContainerStarted","Data":"44ae5eaf8195faf07f45917d7c39118da625d1c6dae6e518ae4042861cf3594f"} Apr 17 17:27:48.447036 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.446890 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr" event={"ID":"f62beaf3-38e8-43b7-8bda-61534c3eb9a3","Type":"ContainerStarted","Data":"a6cb204106841254ea0c06188fe19d01d307ea359cbe066522984fb94b0aef00"} Apr 17 17:27:48.447036 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.446906 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr" event={"ID":"f62beaf3-38e8-43b7-8bda-61534c3eb9a3","Type":"ContainerStarted","Data":"812c66f0fe0f2089608c949f236ba52af2ed5ef51c9f809546a5cfae726edf06"} Apr 17 17:27:48.448952 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.448927 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8" event={"ID":"9fa27954-0b1c-4018-86c7-30ed361a6229","Type":"ContainerStarted","Data":"e5bba93d7a3e9ae7c5bae7ffb356676cd4c6739bee259e29c104a25167008808"} Apr 17 17:27:48.450095 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.450071 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d11f87d0-3474-4c6c-a52a-f414b58a875b","Type":"ContainerStarted","Data":"c83070f11beed6f7f251e1020d41010db0be307aa0a402b0743171ae2d3c0143"} Apr 17 17:27:48.483380 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.483333 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-7qddr" podStartSLOduration=2.117936328 podStartE2EDuration="3.483319751s" podCreationTimestamp="2026-04-17 17:27:45 +0000 UTC" firstStartedPulling="2026-04-17 17:27:46.140869805 +0000 UTC m=+158.763150473" lastFinishedPulling="2026-04-17 
17:27:47.506253234 +0000 UTC m=+160.128533896" observedRunningTime="2026-04-17 17:27:48.482035753 +0000 UTC m=+161.104316424" watchObservedRunningTime="2026-04-17 17:27:48.483319751 +0000 UTC m=+161.105600426" Apr 17 17:27:48.500275 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.500220 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-687v8" podStartSLOduration=2.231960568 podStartE2EDuration="3.500205731s" podCreationTimestamp="2026-04-17 17:27:45 +0000 UTC" firstStartedPulling="2026-04-17 17:27:46.240664844 +0000 UTC m=+158.862945498" lastFinishedPulling="2026-04-17 17:27:47.508910002 +0000 UTC m=+160.131190661" observedRunningTime="2026-04-17 17:27:48.498665151 +0000 UTC m=+161.120945827" watchObservedRunningTime="2026-04-17 17:27:48.500205731 +0000 UTC m=+161.122486405" Apr 17 17:27:48.629546 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.629520 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f76f5bff9-96p8w"] Apr 17 17:27:48.633218 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.633202 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.636084 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.635898 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 17:27:48.636084 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.635995 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 17:27:48.637128 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.636467 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 17:27:48.637232 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.637189 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 17:27:48.639471 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.638847 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 17:27:48.641732 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.641697 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-wgtt5\"" Apr 17 17:27:48.642099 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.642080 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 17:27:48.642561 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.642541 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 17:27:48.644852 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.644827 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f76f5bff9-96p8w"] Apr 17 17:27:48.646385 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.646363 
2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 17:27:48.726125 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.725970 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-serving-cert\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.726125 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.726033 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlwck\" (UniqueName: \"kubernetes.io/projected/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-kube-api-access-hlwck\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.726125 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.726065 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-config\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.726320 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.726213 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-trusted-ca-bundle\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.726320 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.726250 2546 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-oauth-config\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.726320 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.726292 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-service-ca\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.726320 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.726317 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-oauth-serving-cert\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.827734 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.827694 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-service-ca\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.827734 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.827730 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-oauth-serving-cert\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 
17:27:48.827944 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.827764 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-serving-cert\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.827944 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.827800 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlwck\" (UniqueName: \"kubernetes.io/projected/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-kube-api-access-hlwck\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.827944 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.827825 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-config\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.828087 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.828035 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-trusted-ca-bundle\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.828087 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.828074 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-oauth-config\") pod \"console-7f76f5bff9-96p8w\" (UID: 
\"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.828505 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.828477 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-service-ca\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.828610 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.828504 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-config\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.828610 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.828508 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-oauth-serving-cert\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.828942 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.828920 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-trusted-ca-bundle\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.830709 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.830690 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-serving-cert\") pod 
\"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.830779 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.830711 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-oauth-config\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.835277 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.835260 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlwck\" (UniqueName: \"kubernetes.io/projected/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-kube-api-access-hlwck\") pod \"console-7f76f5bff9-96p8w\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") " pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:48.947224 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:48.947140 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f76f5bff9-96p8w" Apr 17 17:27:49.084469 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.084429 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f76f5bff9-96p8w"] Apr 17 17:27:49.087439 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:49.087407 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22e1d5eb_2a9e_4dc9_9d2a_97bf8460f39f.slice/crio-80f92b104bd576fa1da922f98e39fd80fbad206f30bd6291a10fb8890beb7ac3 WatchSource:0}: Error finding container 80f92b104bd576fa1da922f98e39fd80fbad206f30bd6291a10fb8890beb7ac3: Status 404 returned error can't find the container with id 80f92b104bd576fa1da922f98e39fd80fbad206f30bd6291a10fb8890beb7ac3 Apr 17 17:27:49.447010 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.446974 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-567bdb595f-b8hpp"] Apr 17 17:27:49.450268 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.450243 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.453569 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.453545 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 17:27:49.453722 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.453602 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-17ps48jrfacnv\"" Apr 17 17:27:49.453722 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.453546 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 17:27:49.453722 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.453603 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-brp7c\"" Apr 17 17:27:49.453933 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.453544 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 17:27:49.453933 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.453617 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 17:27:49.454830 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.454806 2546 generic.go:358] "Generic (PLEG): container finished" podID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerID="ef283b5799330ef925325781c638d1476d1dd5bfefd6f13f3b06fa4b33cf6aee" exitCode=0 Apr 17 17:27:49.454943 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.454900 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d11f87d0-3474-4c6c-a52a-f414b58a875b","Type":"ContainerDied","Data":"ef283b5799330ef925325781c638d1476d1dd5bfefd6f13f3b06fa4b33cf6aee"} Apr 17 
17:27:49.457876 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.457844 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bcdtv" event={"ID":"2e480281-dc92-4d6c-99c6-1d7dbd41136d","Type":"ContainerStarted","Data":"b2a5f0309ae61b4b7a73290c7120d5011fca696af2f2f1278c24858ee7cd484e"} Apr 17 17:27:49.457876 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.457874 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bcdtv" event={"ID":"2e480281-dc92-4d6c-99c6-1d7dbd41136d","Type":"ContainerStarted","Data":"10af7ee1762a8b338e0295ce0ec8e930c10162720e5df869f9d12960105fcfc7"} Apr 17 17:27:49.459238 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.459210 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-567bdb595f-b8hpp"] Apr 17 17:27:49.459989 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.459962 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f76f5bff9-96p8w" event={"ID":"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f","Type":"ContainerStarted","Data":"80f92b104bd576fa1da922f98e39fd80fbad206f30bd6291a10fb8890beb7ac3"} Apr 17 17:27:49.526127 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.526081 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-bcdtv" podStartSLOduration=3.362540724 podStartE2EDuration="4.526065982s" podCreationTimestamp="2026-04-17 17:27:45 +0000 UTC" firstStartedPulling="2026-04-17 17:27:46.902666625 +0000 UTC m=+159.524947282" lastFinishedPulling="2026-04-17 17:27:48.066191887 +0000 UTC m=+160.688472540" observedRunningTime="2026-04-17 17:27:49.524418685 +0000 UTC m=+162.146699361" watchObservedRunningTime="2026-04-17 17:27:49.526065982 +0000 UTC m=+162.148346664" Apr 17 17:27:49.536331 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.536299 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aff002da-4836-41df-8ef7-6b2c94ee451b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.536500 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.536377 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/aff002da-4836-41df-8ef7-6b2c94ee451b-metrics-server-audit-profiles\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.536500 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.536479 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/aff002da-4836-41df-8ef7-6b2c94ee451b-secret-metrics-server-tls\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.536638 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.536587 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/aff002da-4836-41df-8ef7-6b2c94ee451b-audit-log\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.536638 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.536618 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff002da-4836-41df-8ef7-6b2c94ee451b-client-ca-bundle\") pod 
\"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.537081 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.536883 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c27mx\" (UniqueName: \"kubernetes.io/projected/aff002da-4836-41df-8ef7-6b2c94ee451b-kube-api-access-c27mx\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.537081 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.537033 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/aff002da-4836-41df-8ef7-6b2c94ee451b-secret-metrics-server-client-certs\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.638649 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.638605 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/aff002da-4836-41df-8ef7-6b2c94ee451b-secret-metrics-server-client-certs\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.638850 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.638729 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aff002da-4836-41df-8ef7-6b2c94ee451b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " 
pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.638850 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.638764 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/aff002da-4836-41df-8ef7-6b2c94ee451b-metrics-server-audit-profiles\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.638850 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.638829 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/aff002da-4836-41df-8ef7-6b2c94ee451b-secret-metrics-server-tls\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.639001 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.638874 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/aff002da-4836-41df-8ef7-6b2c94ee451b-audit-log\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.639001 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.638899 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff002da-4836-41df-8ef7-6b2c94ee451b-client-ca-bundle\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.639001 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.638937 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c27mx\" (UniqueName: 
\"kubernetes.io/projected/aff002da-4836-41df-8ef7-6b2c94ee451b-kube-api-access-c27mx\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.639391 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.639354 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/aff002da-4836-41df-8ef7-6b2c94ee451b-audit-log\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.639579 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.639551 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aff002da-4836-41df-8ef7-6b2c94ee451b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.639892 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.639866 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/aff002da-4836-41df-8ef7-6b2c94ee451b-metrics-server-audit-profiles\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.641828 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.641804 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/aff002da-4836-41df-8ef7-6b2c94ee451b-secret-metrics-server-client-certs\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" 
Apr 17 17:27:49.642073 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.642052 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff002da-4836-41df-8ef7-6b2c94ee451b-client-ca-bundle\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.642567 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.642546 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/aff002da-4836-41df-8ef7-6b2c94ee451b-secret-metrics-server-tls\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.648058 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.648033 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c27mx\" (UniqueName: \"kubernetes.io/projected/aff002da-4836-41df-8ef7-6b2c94ee451b-kube-api-access-c27mx\") pod \"metrics-server-567bdb595f-b8hpp\" (UID: \"aff002da-4836-41df-8ef7-6b2c94ee451b\") " pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.763390 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.763353 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" Apr 17 17:27:49.826054 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.826014 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xgv4w"] Apr 17 17:27:49.831336 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.831317 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xgv4w" Apr 17 17:27:49.833965 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.833941 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-6fkxj\"" Apr 17 17:27:49.834116 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.833967 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 17:27:49.843726 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.843655 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xgv4w"] Apr 17 17:27:49.915289 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.912949 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-567bdb595f-b8hpp"] Apr 17 17:27:49.920911 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:49.920869 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaff002da_4836_41df_8ef7_6b2c94ee451b.slice/crio-63b54ec38395ce2d78ecc21f56d4933d0a8e05b6f331b86577fcb575c4d3700c WatchSource:0}: Error finding container 63b54ec38395ce2d78ecc21f56d4933d0a8e05b6f331b86577fcb575c4d3700c: Status 404 returned error can't find the container with id 63b54ec38395ce2d78ecc21f56d4933d0a8e05b6f331b86577fcb575c4d3700c Apr 17 17:27:49.941320 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:49.941291 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0a6d5ca8-e84c-43e3-b204-3b6fd91ac786-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xgv4w\" (UID: \"0a6d5ca8-e84c-43e3-b204-3b6fd91ac786\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xgv4w" Apr 17 17:27:50.042872 ip-10-0-137-46 kubenswrapper[2546]: I0417 
17:27:50.042821 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0a6d5ca8-e84c-43e3-b204-3b6fd91ac786-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xgv4w\" (UID: \"0a6d5ca8-e84c-43e3-b204-3b6fd91ac786\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xgv4w" Apr 17 17:27:50.045733 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:50.045663 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0a6d5ca8-e84c-43e3-b204-3b6fd91ac786-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xgv4w\" (UID: \"0a6d5ca8-e84c-43e3-b204-3b6fd91ac786\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xgv4w" Apr 17 17:27:50.141716 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:50.141664 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xgv4w" Apr 17 17:27:50.288108 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:50.288070 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xgv4w"] Apr 17 17:27:50.291159 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:50.291130 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a6d5ca8_e84c_43e3_b204_3b6fd91ac786.slice/crio-c38db208f9778149a8514cf207c61faec634400f7b6c718b58692795534b4f19 WatchSource:0}: Error finding container c38db208f9778149a8514cf207c61faec634400f7b6c718b58692795534b4f19: Status 404 returned error can't find the container with id c38db208f9778149a8514cf207c61faec634400f7b6c718b58692795534b4f19 Apr 17 17:27:50.465231 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:50.465189 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xgv4w" 
event={"ID":"0a6d5ca8-e84c-43e3-b204-3b6fd91ac786","Type":"ContainerStarted","Data":"c38db208f9778149a8514cf207c61faec634400f7b6c718b58692795534b4f19"} Apr 17 17:27:50.467108 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:50.467047 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sm5m5" event={"ID":"d82d71b1-2458-4671-b28c-5e3870cd761a","Type":"ContainerStarted","Data":"5eb2517ff35503d9869d42eee043566bd0707e118ad0129d43858d9546590f9f"} Apr 17 17:27:50.467108 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:50.467083 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sm5m5" event={"ID":"d82d71b1-2458-4671-b28c-5e3870cd761a","Type":"ContainerStarted","Data":"3495ec641183ec8f4ee03b403dd6420c6fc79a1ae975cfa09be122087affd4d1"} Apr 17 17:27:50.467519 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:50.467312 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-sm5m5" Apr 17 17:27:50.468854 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:50.468824 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" event={"ID":"aff002da-4836-41df-8ef7-6b2c94ee451b","Type":"ContainerStarted","Data":"63b54ec38395ce2d78ecc21f56d4933d0a8e05b6f331b86577fcb575c4d3700c"} Apr 17 17:27:50.484296 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:50.484227 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sm5m5" podStartSLOduration=129.149904427 podStartE2EDuration="2m10.48421143s" podCreationTimestamp="2026-04-17 17:25:40 +0000 UTC" firstStartedPulling="2026-04-17 17:27:48.379100039 +0000 UTC m=+161.001380692" lastFinishedPulling="2026-04-17 17:27:49.713407041 +0000 UTC m=+162.335687695" observedRunningTime="2026-04-17 17:27:50.483749712 +0000 UTC m=+163.106030388" watchObservedRunningTime="2026-04-17 17:27:50.48421143 +0000 UTC m=+163.106492103" Apr 17 
17:27:51.475436 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:51.475386 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d11f87d0-3474-4c6c-a52a-f414b58a875b","Type":"ContainerStarted","Data":"26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934"} Apr 17 17:27:51.475880 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:51.475444 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d11f87d0-3474-4c6c-a52a-f414b58a875b","Type":"ContainerStarted","Data":"2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9"} Apr 17 17:27:51.475880 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:51.475461 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d11f87d0-3474-4c6c-a52a-f414b58a875b","Type":"ContainerStarted","Data":"1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614"} Apr 17 17:27:53.482199 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:53.482163 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f76f5bff9-96p8w" event={"ID":"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f","Type":"ContainerStarted","Data":"143b36e304c90fdb6c9279bd1b9694e538d14deac61caae778a0ae9966392f7c"} Apr 17 17:27:53.483601 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:53.483575 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" event={"ID":"aff002da-4836-41df-8ef7-6b2c94ee451b","Type":"ContainerStarted","Data":"0609e36589c3067aaeb5399962599ddee9bc69bd9c19bfad38e0791ac6bb9efb"} Apr 17 17:27:53.486411 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:53.486389 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"d11f87d0-3474-4c6c-a52a-f414b58a875b","Type":"ContainerStarted","Data":"dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61"} Apr 17 17:27:53.486502 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:53.486418 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d11f87d0-3474-4c6c-a52a-f414b58a875b","Type":"ContainerStarted","Data":"2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915"} Apr 17 17:27:53.487642 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:53.487621 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xgv4w" event={"ID":"0a6d5ca8-e84c-43e3-b204-3b6fd91ac786","Type":"ContainerStarted","Data":"a51c11e8cab1d588f3be0fe14ed9e349e695d7b5a75155f905bf7f6aeab6dfc4"} Apr 17 17:27:53.487825 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:53.487802 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xgv4w" Apr 17 17:27:53.492546 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:53.492531 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xgv4w" Apr 17 17:27:53.502301 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:53.502252 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f76f5bff9-96p8w" podStartSLOduration=1.558883268 podStartE2EDuration="5.502238882s" podCreationTimestamp="2026-04-17 17:27:48 +0000 UTC" firstStartedPulling="2026-04-17 17:27:49.089434607 +0000 UTC m=+161.711715263" lastFinishedPulling="2026-04-17 17:27:53.03279022 +0000 UTC m=+165.655070877" observedRunningTime="2026-04-17 17:27:53.501185511 +0000 UTC m=+166.123466186" watchObservedRunningTime="2026-04-17 17:27:53.502238882 +0000 UTC m=+166.124519559" Apr 17 17:27:53.516913 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:53.516872 
2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xgv4w" podStartSLOduration=1.779068323 podStartE2EDuration="4.516862728s" podCreationTimestamp="2026-04-17 17:27:49 +0000 UTC" firstStartedPulling="2026-04-17 17:27:50.293915574 +0000 UTC m=+162.916196230" lastFinishedPulling="2026-04-17 17:27:53.031709976 +0000 UTC m=+165.653990635" observedRunningTime="2026-04-17 17:27:53.515889061 +0000 UTC m=+166.138169737" watchObservedRunningTime="2026-04-17 17:27:53.516862728 +0000 UTC m=+166.139143404" Apr 17 17:27:53.532620 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:53.532584 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp" podStartSLOduration=1.4276115 podStartE2EDuration="4.532571836s" podCreationTimestamp="2026-04-17 17:27:49 +0000 UTC" firstStartedPulling="2026-04-17 17:27:49.924046619 +0000 UTC m=+162.546327272" lastFinishedPulling="2026-04-17 17:27:53.02900695 +0000 UTC m=+165.651287608" observedRunningTime="2026-04-17 17:27:53.531759749 +0000 UTC m=+166.154040424" watchObservedRunningTime="2026-04-17 17:27:53.532571836 +0000 UTC m=+166.154852512" Apr 17 17:27:54.493937 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:54.493897 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d11f87d0-3474-4c6c-a52a-f414b58a875b","Type":"ContainerStarted","Data":"8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2"} Apr 17 17:27:54.524869 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:54.524788 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.156493838 podStartE2EDuration="8.524761388s" podCreationTimestamp="2026-04-17 17:27:46 +0000 UTC" firstStartedPulling="2026-04-17 17:27:47.66823593 +0000 UTC m=+160.290516583" lastFinishedPulling="2026-04-17 
17:27:54.036503467 +0000 UTC m=+166.658784133" observedRunningTime="2026-04-17 17:27:54.523502354 +0000 UTC m=+167.145783032" watchObservedRunningTime="2026-04-17 17:27:54.524761388 +0000 UTC m=+167.147042063" Apr 17 17:27:54.949194 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:54.949162 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:27:54.951754 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:54.951731 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fhhs9\"" Apr 17 17:27:54.959838 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:54.959819 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kgnl9" Apr 17 17:27:55.074350 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:55.074317 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kgnl9"] Apr 17 17:27:55.077245 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:55.077217 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4397ebe_1923_4566_89e4_f777e71713b1.slice/crio-6aca6ec99934ea8c140b7f4bf55e00689e229b3fa6ddb22fbd52bbb2ba2c05ea WatchSource:0}: Error finding container 6aca6ec99934ea8c140b7f4bf55e00689e229b3fa6ddb22fbd52bbb2ba2c05ea: Status 404 returned error can't find the container with id 6aca6ec99934ea8c140b7f4bf55e00689e229b3fa6ddb22fbd52bbb2ba2c05ea Apr 17 17:27:55.498028 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:55.497992 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kgnl9" event={"ID":"e4397ebe-1923-4566-89e4-f777e71713b1","Type":"ContainerStarted","Data":"6aca6ec99934ea8c140b7f4bf55e00689e229b3fa6ddb22fbd52bbb2ba2c05ea"} Apr 17 17:27:56.072291 ip-10-0-137-46 kubenswrapper[2546]: I0417 
17:27:56.072250 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8bf588c6c-fjlbl"] Apr 17 17:27:56.075805 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.075777 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8bf588c6c-fjlbl" Apr 17 17:27:56.086853 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.086828 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8bf588c6c-fjlbl"] Apr 17 17:27:56.211574 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.211537 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-service-ca\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl" Apr 17 17:27:56.211778 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.211618 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca0f866a-8b99-4281-884c-c85069491b36-console-oauth-config\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl" Apr 17 17:27:56.211778 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.211665 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-trusted-ca-bundle\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl" Apr 17 17:27:56.211778 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.211733 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-oauth-serving-cert\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl" Apr 17 17:27:56.211898 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.211782 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-console-config\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl" Apr 17 17:27:56.211898 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.211801 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca0f866a-8b99-4281-884c-c85069491b36-console-serving-cert\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl" Apr 17 17:27:56.211898 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.211820 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cxg9\" (UniqueName: \"kubernetes.io/projected/ca0f866a-8b99-4281-884c-c85069491b36-kube-api-access-9cxg9\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl" Apr 17 17:27:56.312598 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.312560 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-service-ca\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl" Apr 17 17:27:56.312771 ip-10-0-137-46 kubenswrapper[2546]: I0417 
17:27:56.312623 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca0f866a-8b99-4281-884c-c85069491b36-console-oauth-config\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl" Apr 17 17:27:56.312771 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.312647 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-trusted-ca-bundle\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl" Apr 17 17:27:56.312771 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.312670 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-oauth-serving-cert\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl" Apr 17 17:27:56.312771 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.312728 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-console-config\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl" Apr 17 17:27:56.312771 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.312753 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca0f866a-8b99-4281-884c-c85069491b36-console-serving-cert\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl" Apr 17 
17:27:56.313019 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.312790 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cxg9\" (UniqueName: \"kubernetes.io/projected/ca0f866a-8b99-4281-884c-c85069491b36-kube-api-access-9cxg9\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl"
Apr 17 17:27:56.313475 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.313448 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-console-config\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl"
Apr 17 17:27:56.313475 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.313467 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-service-ca\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl"
Apr 17 17:27:56.313731 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.313528 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-oauth-serving-cert\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl"
Apr 17 17:27:56.314093 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.314075 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-trusted-ca-bundle\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl"
Apr 17 17:27:56.315097 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.315076 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca0f866a-8b99-4281-884c-c85069491b36-console-serving-cert\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl"
Apr 17 17:27:56.315207 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.315191 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca0f866a-8b99-4281-884c-c85069491b36-console-oauth-config\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl"
Apr 17 17:27:56.321266 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.321243 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cxg9\" (UniqueName: \"kubernetes.io/projected/ca0f866a-8b99-4281-884c-c85069491b36-kube-api-access-9cxg9\") pod \"console-8bf588c6c-fjlbl\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " pod="openshift-console/console-8bf588c6c-fjlbl"
Apr 17 17:27:56.386845 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.386763 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8bf588c6c-fjlbl"
Apr 17 17:27:56.716226 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.716148 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:27:56.934856 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.934834 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8bf588c6c-fjlbl"]
Apr 17 17:27:56.937428 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:27:56.937408 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca0f866a_8b99_4281_884c_c85069491b36.slice/crio-6bd0cf1c17322c594c1834a2dc8927e061aacc545edbac0255fbe41e9cc27e7a WatchSource:0}: Error finding container 6bd0cf1c17322c594c1834a2dc8927e061aacc545edbac0255fbe41e9cc27e7a: Status 404 returned error can't find the container with id 6bd0cf1c17322c594c1834a2dc8927e061aacc545edbac0255fbe41e9cc27e7a
Apr 17 17:27:56.949890 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:56.949865 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql"
Apr 17 17:27:57.057265 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:57.057232 2546 patch_prober.go:28] interesting pod/image-registry-7cd7cd7b76-mbkvf container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 17:27:57.057428 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:57.057295 2546 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf" podUID="bf3f4e12-9e99-452e-8ff8-dd441d2d2b39" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:27:57.508935 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:57.508898 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8bf588c6c-fjlbl" event={"ID":"ca0f866a-8b99-4281-884c-c85069491b36","Type":"ContainerStarted","Data":"0188eeac3a3cc39fde9cd9f4619776f714130465c7c2bca25aab0919a256aacd"}
Apr 17 17:27:57.508935 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:57.508937 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8bf588c6c-fjlbl" event={"ID":"ca0f866a-8b99-4281-884c-c85069491b36","Type":"ContainerStarted","Data":"6bd0cf1c17322c594c1834a2dc8927e061aacc545edbac0255fbe41e9cc27e7a"}
Apr 17 17:27:57.510244 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:57.510221 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kgnl9" event={"ID":"e4397ebe-1923-4566-89e4-f777e71713b1","Type":"ContainerStarted","Data":"0e7e3aa26c7f8724d7201e9f0ce4150294518b8213feb6535015eeb76fb4e8c3"}
Apr 17 17:27:57.528241 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:57.528193 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8bf588c6c-fjlbl" podStartSLOduration=1.5281799600000001 podStartE2EDuration="1.52817996s" podCreationTimestamp="2026-04-17 17:27:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:27:57.52666189 +0000 UTC m=+170.148942567" watchObservedRunningTime="2026-04-17 17:27:57.52817996 +0000 UTC m=+170.150460635"
Apr 17 17:27:57.542110 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:57.542071 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kgnl9" podStartSLOduration=135.775385756 podStartE2EDuration="2m17.542058862s" podCreationTimestamp="2026-04-17 17:25:40 +0000 UTC" firstStartedPulling="2026-04-17 17:27:55.079055505 +0000 UTC m=+167.701336161" lastFinishedPulling="2026-04-17 17:27:56.8457286 +0000 UTC m=+169.468009267" observedRunningTime="2026-04-17 17:27:57.540776448 +0000 UTC m=+170.163057122" watchObservedRunningTime="2026-04-17 17:27:57.542058862 +0000 UTC m=+170.164339536"
Apr 17 17:27:58.412655 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:58.412625 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7cd7cd7b76-mbkvf"
Apr 17 17:27:58.948096 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:58.948062 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7f76f5bff9-96p8w"
Apr 17 17:27:58.948096 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:58.948103 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f76f5bff9-96p8w"
Apr 17 17:27:58.952722 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:58.952698 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f76f5bff9-96p8w"
Apr 17 17:27:59.519868 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:27:59.519842 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f76f5bff9-96p8w"
Apr 17 17:28:00.478157 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:00.478126 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sm5m5"
Apr 17 17:28:01.731200 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:01.731123 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl" podUID="43b8a62b-9f9f-4274-a62a-7a991a1ebb1d" containerName="registry" containerID="cri-o://eb8f0268b1830990a0b959b5cbc2eea11baacff6f9afb77fa69dbcee6be880f9" gracePeriod=30
Apr 17 17:28:01.964276 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:01.964254 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:28:02.063833 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.063798 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-certificates\") pod \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") "
Apr 17 17:28:02.064008 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.063864 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-bound-sa-token\") pod \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") "
Apr 17 17:28:02.064008 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.063881 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-trusted-ca\") pod \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") "
Apr 17 17:28:02.064008 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.063901 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls\") pod \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") "
Apr 17 17:28:02.064008 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.063917 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-image-registry-private-configuration\") pod \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") "
Apr 17 17:28:02.064008 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.063938 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9vbk\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-kube-api-access-x9vbk\") pod \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") "
Apr 17 17:28:02.064008 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.063985 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-installation-pull-secrets\") pod \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") "
Apr 17 17:28:02.064301 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.064011 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-ca-trust-extracted\") pod \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\" (UID: \"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d\") "
Apr 17 17:28:02.064301 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.064284 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d" (UID: "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:02.064401 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.064293 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d" (UID: "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:02.066343 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.066314 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d" (UID: "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:28:02.066471 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.066370 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d" (UID: "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:02.066617 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.066592 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d" (UID: "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:28:02.066737 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.066612 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d" (UID: "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:02.066737 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.066626 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-kube-api-access-x9vbk" (OuterVolumeSpecName: "kube-api-access-x9vbk") pod "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d" (UID: "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d"). InnerVolumeSpecName "kube-api-access-x9vbk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:28:02.072531 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.072508 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d" (UID: "43b8a62b-9f9f-4274-a62a-7a991a1ebb1d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:28:02.164928 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.164888 2546 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-tls\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:28:02.164928 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.164919 2546 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-image-registry-private-configuration\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:28:02.164928 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.164929 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x9vbk\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-kube-api-access-x9vbk\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:28:02.164928 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.164939 2546 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-installation-pull-secrets\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:28:02.165172 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.164948 2546 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-ca-trust-extracted\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:28:02.165172 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.164957 2546 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-registry-certificates\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:28:02.165172 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.164967 2546 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-bound-sa-token\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:28:02.165172 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.164976 2546 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d-trusted-ca\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:28:02.526479 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.526393 2546 generic.go:358] "Generic (PLEG): container finished" podID="43b8a62b-9f9f-4274-a62a-7a991a1ebb1d" containerID="eb8f0268b1830990a0b959b5cbc2eea11baacff6f9afb77fa69dbcee6be880f9" exitCode=0
Apr 17 17:28:02.526479 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.526466 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl"
Apr 17 17:28:02.526479 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.526473 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl" event={"ID":"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d","Type":"ContainerDied","Data":"eb8f0268b1830990a0b959b5cbc2eea11baacff6f9afb77fa69dbcee6be880f9"}
Apr 17 17:28:02.526751 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.526510 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d58f4c77b-xxjhl" event={"ID":"43b8a62b-9f9f-4274-a62a-7a991a1ebb1d","Type":"ContainerDied","Data":"333cea9a026a3f81a7d50243931bdb76c1ed834c2324f6bb0036fcbb8b64fe42"}
Apr 17 17:28:02.526751 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.526530 2546 scope.go:117] "RemoveContainer" containerID="eb8f0268b1830990a0b959b5cbc2eea11baacff6f9afb77fa69dbcee6be880f9"
Apr 17 17:28:02.534899 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.534880 2546 scope.go:117] "RemoveContainer" containerID="eb8f0268b1830990a0b959b5cbc2eea11baacff6f9afb77fa69dbcee6be880f9"
Apr 17 17:28:02.535158 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:28:02.535139 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb8f0268b1830990a0b959b5cbc2eea11baacff6f9afb77fa69dbcee6be880f9\": container with ID starting with eb8f0268b1830990a0b959b5cbc2eea11baacff6f9afb77fa69dbcee6be880f9 not found: ID does not exist" containerID="eb8f0268b1830990a0b959b5cbc2eea11baacff6f9afb77fa69dbcee6be880f9"
Apr 17 17:28:02.535216 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.535168 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb8f0268b1830990a0b959b5cbc2eea11baacff6f9afb77fa69dbcee6be880f9"} err="failed to get container status \"eb8f0268b1830990a0b959b5cbc2eea11baacff6f9afb77fa69dbcee6be880f9\": rpc error: code = NotFound desc = could not find container \"eb8f0268b1830990a0b959b5cbc2eea11baacff6f9afb77fa69dbcee6be880f9\": container with ID starting with eb8f0268b1830990a0b959b5cbc2eea11baacff6f9afb77fa69dbcee6be880f9 not found: ID does not exist"
Apr 17 17:28:02.547704 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.547667 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d58f4c77b-xxjhl"]
Apr 17 17:28:02.552150 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:02.552130 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6d58f4c77b-xxjhl"]
Apr 17 17:28:03.956328 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:03.956295 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43b8a62b-9f9f-4274-a62a-7a991a1ebb1d" path="/var/lib/kubelet/pods/43b8a62b-9f9f-4274-a62a-7a991a1ebb1d/volumes"
Apr 17 17:28:06.387629 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:06.387598 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8bf588c6c-fjlbl"
Apr 17 17:28:06.388050 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:06.387641 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8bf588c6c-fjlbl"
Apr 17 17:28:06.392242 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:06.392218 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8bf588c6c-fjlbl"
Apr 17 17:28:06.541750 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:06.541724 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8bf588c6c-fjlbl"
Apr 17 17:28:06.594137 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:06.594104 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f76f5bff9-96p8w"]
Apr 17 17:28:09.764106 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:09.764073 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp"
Apr 17 17:28:09.764106 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:09.764117 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp"
Apr 17 17:28:24.591820 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:24.591784 2546 generic.go:358] "Generic (PLEG): container finished" podID="e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc" containerID="b9c47d4c4af871ff271e96cd120cf2064cb5e8b95261235922095eb3e37b897c" exitCode=0
Apr 17 17:28:24.592234 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:24.591862 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-99kn4" event={"ID":"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc","Type":"ContainerDied","Data":"b9c47d4c4af871ff271e96cd120cf2064cb5e8b95261235922095eb3e37b897c"}
Apr 17 17:28:24.592234 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:24.592206 2546 scope.go:117] "RemoveContainer" containerID="b9c47d4c4af871ff271e96cd120cf2064cb5e8b95261235922095eb3e37b897c"
Apr 17 17:28:25.596220 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:25.596190 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-99kn4" event={"ID":"e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc","Type":"ContainerStarted","Data":"b306e7f90af63bf2d14e67dcc07b53f344d8bc06a261bf02b8cc5a5ac8e384c9"}
Apr 17 17:28:29.769062 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:29.769026 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp"
Apr 17 17:28:29.772835 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:29.772812 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-567bdb595f-b8hpp"
Apr 17 17:28:31.613528 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.613476 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7f76f5bff9-96p8w" podUID="22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f" containerName="console" containerID="cri-o://143b36e304c90fdb6c9279bd1b9694e538d14deac61caae778a0ae9966392f7c" gracePeriod=15
Apr 17 17:28:31.859129 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.859109 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f76f5bff9-96p8w_22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f/console/0.log"
Apr 17 17:28:31.859229 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.859165 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f76f5bff9-96p8w"
Apr 17 17:28:31.921415 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.921337 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-trusted-ca-bundle\") pod \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") "
Apr 17 17:28:31.921415 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.921383 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-serving-cert\") pod \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") "
Apr 17 17:28:31.921415 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.921402 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-config\") pod \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") "
Apr 17 17:28:31.921672 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.921434 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlwck\" (UniqueName: \"kubernetes.io/projected/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-kube-api-access-hlwck\") pod \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") "
Apr 17 17:28:31.921672 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.921471 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-service-ca\") pod \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") "
Apr 17 17:28:31.921672 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.921489 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-oauth-serving-cert\") pod \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") "
Apr 17 17:28:31.921672 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.921526 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-oauth-config\") pod \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\" (UID: \"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f\") "
Apr 17 17:28:31.921905 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.921876 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f" (UID: "22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:31.921905 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.921890 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-config" (OuterVolumeSpecName: "console-config") pod "22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f" (UID: "22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:31.922016 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.921921 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-service-ca" (OuterVolumeSpecName: "service-ca") pod "22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f" (UID: "22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:31.922016 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.921996 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f" (UID: "22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:31.923997 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.923972 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-kube-api-access-hlwck" (OuterVolumeSpecName: "kube-api-access-hlwck") pod "22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f" (UID: "22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f"). InnerVolumeSpecName "kube-api-access-hlwck". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:28:31.924098 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.924002 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f" (UID: "22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:31.924098 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:31.924070 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f" (UID: "22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:32.027066 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.023310 2546 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-oauth-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:28:32.027066 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.023347 2546 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-trusted-ca-bundle\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:28:32.027066 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.023370 2546 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-serving-cert\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:28:32.027066 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.023385 2546 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-console-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:28:32.027066 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.023402 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hlwck\" (UniqueName: \"kubernetes.io/projected/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-kube-api-access-hlwck\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:28:32.027066 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.023418 2546 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-service-ca\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:28:32.027066 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.023437 2546 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f-oauth-serving-cert\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:28:32.615308 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.615280 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f76f5bff9-96p8w_22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f/console/0.log"
Apr 17 17:28:32.615730 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.615319 2546 generic.go:358] "Generic (PLEG): container finished" podID="22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f" containerID="143b36e304c90fdb6c9279bd1b9694e538d14deac61caae778a0ae9966392f7c" exitCode=2
Apr 17 17:28:32.615730 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.615380 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f76f5bff9-96p8w" event={"ID":"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f","Type":"ContainerDied","Data":"143b36e304c90fdb6c9279bd1b9694e538d14deac61caae778a0ae9966392f7c"}
Apr 17 17:28:32.615730 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.615407 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f76f5bff9-96p8w" event={"ID":"22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f","Type":"ContainerDied","Data":"80f92b104bd576fa1da922f98e39fd80fbad206f30bd6291a10fb8890beb7ac3"}
Apr 17 17:28:32.615730 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.615411 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f76f5bff9-96p8w"
Apr 17 17:28:32.615730 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.615421 2546 scope.go:117] "RemoveContainer" containerID="143b36e304c90fdb6c9279bd1b9694e538d14deac61caae778a0ae9966392f7c"
Apr 17 17:28:32.622850 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.622833 2546 scope.go:117] "RemoveContainer" containerID="143b36e304c90fdb6c9279bd1b9694e538d14deac61caae778a0ae9966392f7c"
Apr 17 17:28:32.623128 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:28:32.623111 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"143b36e304c90fdb6c9279bd1b9694e538d14deac61caae778a0ae9966392f7c\": container with ID starting with 143b36e304c90fdb6c9279bd1b9694e538d14deac61caae778a0ae9966392f7c not found: ID does not exist" containerID="143b36e304c90fdb6c9279bd1b9694e538d14deac61caae778a0ae9966392f7c"
Apr 17 17:28:32.623170 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.623136 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"143b36e304c90fdb6c9279bd1b9694e538d14deac61caae778a0ae9966392f7c"} err="failed to get container status \"143b36e304c90fdb6c9279bd1b9694e538d14deac61caae778a0ae9966392f7c\": rpc error: code = NotFound desc = could not find container \"143b36e304c90fdb6c9279bd1b9694e538d14deac61caae778a0ae9966392f7c\": container with ID starting with 143b36e304c90fdb6c9279bd1b9694e538d14deac61caae778a0ae9966392f7c not found: ID does not exist"
Apr 17 17:28:32.632258 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.632234 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f76f5bff9-96p8w"]
Apr 17 17:28:32.636121 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:32.636101 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7f76f5bff9-96p8w"]
Apr 17 17:28:33.953657 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:33.953622 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f" path="/var/lib/kubelet/pods/22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f/volumes"
Apr 17 17:28:41.646857 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:41.646773 2546 generic.go:358] "Generic (PLEG): container finished" podID="95a78b42-cd3a-409d-8ce7-11b8805103c6" containerID="37a978ac484d36fca169a9b922143fe3a1b2b7cb023f25646b49d0c19a21aa2b" exitCode=0
Apr 17 17:28:41.646857 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:41.646834 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh" event={"ID":"95a78b42-cd3a-409d-8ce7-11b8805103c6","Type":"ContainerDied","Data":"37a978ac484d36fca169a9b922143fe3a1b2b7cb023f25646b49d0c19a21aa2b"}
Apr 17 17:28:41.647226 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:41.647140 2546 scope.go:117] "RemoveContainer" containerID="37a978ac484d36fca169a9b922143fe3a1b2b7cb023f25646b49d0c19a21aa2b"
Apr 17 17:28:42.651214 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:28:42.651181 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hpbfh" event={"ID":"95a78b42-cd3a-409d-8ce7-11b8805103c6","Type":"ContainerStarted","Data":"bd278a84ad473f455212e2c015e07e27018b7189f74741e1fe622af8b5064b27"}
Apr 17 17:29:05.311618 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:05.311582 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:29:05.312120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:05.312015 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="alertmanager" containerID="cri-o://1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614" gracePeriod=120
Apr 17 17:29:05.312198 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:05.312078 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="kube-rbac-proxy-metric" containerID="cri-o://dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61" gracePeriod=120
Apr 17 17:29:05.312198 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:05.312103 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="kube-rbac-proxy-web" containerID="cri-o://26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934" gracePeriod=120
Apr 17 17:29:05.312198 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:05.312141 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="config-reloader" containerID="cri-o://2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9" gracePeriod=120
Apr 17 17:29:05.312346 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:05.312192 2546 kuberuntime_container.go:864]
"Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="kube-rbac-proxy" containerID="cri-o://2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915" gracePeriod=120 Apr 17 17:29:05.312346 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:05.312179 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="prom-label-proxy" containerID="cri-o://8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2" gracePeriod=120 Apr 17 17:29:05.720280 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:05.720198 2546 generic.go:358] "Generic (PLEG): container finished" podID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerID="8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2" exitCode=0 Apr 17 17:29:05.720280 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:05.720221 2546 generic.go:358] "Generic (PLEG): container finished" podID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerID="2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915" exitCode=0 Apr 17 17:29:05.720280 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:05.720227 2546 generic.go:358] "Generic (PLEG): container finished" podID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerID="2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9" exitCode=0 Apr 17 17:29:05.720280 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:05.720233 2546 generic.go:358] "Generic (PLEG): container finished" podID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerID="1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614" exitCode=0 Apr 17 17:29:05.720280 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:05.720270 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"d11f87d0-3474-4c6c-a52a-f414b58a875b","Type":"ContainerDied","Data":"8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2"} Apr 17 17:29:05.720532 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:05.720300 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d11f87d0-3474-4c6c-a52a-f414b58a875b","Type":"ContainerDied","Data":"2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915"} Apr 17 17:29:05.720532 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:05.720313 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d11f87d0-3474-4c6c-a52a-f414b58a875b","Type":"ContainerDied","Data":"2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9"} Apr 17 17:29:05.720532 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:05.720322 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d11f87d0-3474-4c6c-a52a-f414b58a875b","Type":"ContainerDied","Data":"1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614"} Apr 17 17:29:06.552216 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.552192 2546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:29:06.625100 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.625068 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-web-config\") pod \"d11f87d0-3474-4c6c-a52a-f414b58a875b\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " Apr 17 17:29:06.625100 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.625106 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-main-tls\") pod \"d11f87d0-3474-4c6c-a52a-f414b58a875b\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " Apr 17 17:29:06.625335 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.625126 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d11f87d0-3474-4c6c-a52a-f414b58a875b-config-out\") pod \"d11f87d0-3474-4c6c-a52a-f414b58a875b\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " Apr 17 17:29:06.625335 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.625156 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy\") pod \"d11f87d0-3474-4c6c-a52a-f414b58a875b\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " Apr 17 17:29:06.625335 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.625183 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-config-volume\") pod \"d11f87d0-3474-4c6c-a52a-f414b58a875b\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " Apr 17 17:29:06.625335 
ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.625210 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d11f87d0-3474-4c6c-a52a-f414b58a875b-alertmanager-trusted-ca-bundle\") pod \"d11f87d0-3474-4c6c-a52a-f414b58a875b\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " Apr 17 17:29:06.625335 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.625234 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy-web\") pod \"d11f87d0-3474-4c6c-a52a-f414b58a875b\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " Apr 17 17:29:06.625335 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.625258 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d11f87d0-3474-4c6c-a52a-f414b58a875b-alertmanager-main-db\") pod \"d11f87d0-3474-4c6c-a52a-f414b58a875b\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " Apr 17 17:29:06.625335 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.625314 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d11f87d0-3474-4c6c-a52a-f414b58a875b-tls-assets\") pod \"d11f87d0-3474-4c6c-a52a-f414b58a875b\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " Apr 17 17:29:06.625736 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.625342 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzg9t\" (UniqueName: \"kubernetes.io/projected/d11f87d0-3474-4c6c-a52a-f414b58a875b-kube-api-access-rzg9t\") pod \"d11f87d0-3474-4c6c-a52a-f414b58a875b\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " Apr 17 17:29:06.625736 ip-10-0-137-46 kubenswrapper[2546]: I0417 
17:29:06.625369 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"d11f87d0-3474-4c6c-a52a-f414b58a875b\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " Apr 17 17:29:06.625736 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.625404 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-cluster-tls-config\") pod \"d11f87d0-3474-4c6c-a52a-f414b58a875b\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " Apr 17 17:29:06.625736 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.625467 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d11f87d0-3474-4c6c-a52a-f414b58a875b-metrics-client-ca\") pod \"d11f87d0-3474-4c6c-a52a-f414b58a875b\" (UID: \"d11f87d0-3474-4c6c-a52a-f414b58a875b\") " Apr 17 17:29:06.626204 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.626151 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d11f87d0-3474-4c6c-a52a-f414b58a875b-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "d11f87d0-3474-4c6c-a52a-f414b58a875b" (UID: "d11f87d0-3474-4c6c-a52a-f414b58a875b"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:29:06.627240 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.627175 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d11f87d0-3474-4c6c-a52a-f414b58a875b-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "d11f87d0-3474-4c6c-a52a-f414b58a875b" (UID: "d11f87d0-3474-4c6c-a52a-f414b58a875b"). 
InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:29:06.627905 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.627876 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d11f87d0-3474-4c6c-a52a-f414b58a875b-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "d11f87d0-3474-4c6c-a52a-f414b58a875b" (UID: "d11f87d0-3474-4c6c-a52a-f414b58a875b"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:29:06.628101 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.628070 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-config-volume" (OuterVolumeSpecName: "config-volume") pod "d11f87d0-3474-4c6c-a52a-f414b58a875b" (UID: "d11f87d0-3474-4c6c-a52a-f414b58a875b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:06.628340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.628321 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d11f87d0-3474-4c6c-a52a-f414b58a875b-config-out" (OuterVolumeSpecName: "config-out") pod "d11f87d0-3474-4c6c-a52a-f414b58a875b" (UID: "d11f87d0-3474-4c6c-a52a-f414b58a875b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:29:06.628512 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.628492 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "d11f87d0-3474-4c6c-a52a-f414b58a875b" (UID: "d11f87d0-3474-4c6c-a52a-f414b58a875b"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:06.628733 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.628712 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11f87d0-3474-4c6c-a52a-f414b58a875b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d11f87d0-3474-4c6c-a52a-f414b58a875b" (UID: "d11f87d0-3474-4c6c-a52a-f414b58a875b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:29:06.629386 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.629351 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "d11f87d0-3474-4c6c-a52a-f414b58a875b" (UID: "d11f87d0-3474-4c6c-a52a-f414b58a875b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:06.629493 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.629423 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "d11f87d0-3474-4c6c-a52a-f414b58a875b" (UID: "d11f87d0-3474-4c6c-a52a-f414b58a875b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:06.630171 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.630154 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11f87d0-3474-4c6c-a52a-f414b58a875b-kube-api-access-rzg9t" (OuterVolumeSpecName: "kube-api-access-rzg9t") pod "d11f87d0-3474-4c6c-a52a-f414b58a875b" (UID: "d11f87d0-3474-4c6c-a52a-f414b58a875b"). InnerVolumeSpecName "kube-api-access-rzg9t". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:29:06.630490 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.630475 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "d11f87d0-3474-4c6c-a52a-f414b58a875b" (UID: "d11f87d0-3474-4c6c-a52a-f414b58a875b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:06.633061 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.633038 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "d11f87d0-3474-4c6c-a52a-f414b58a875b" (UID: "d11f87d0-3474-4c6c-a52a-f414b58a875b"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:06.638794 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.638774 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-web-config" (OuterVolumeSpecName: "web-config") pod "d11f87d0-3474-4c6c-a52a-f414b58a875b" (UID: "d11f87d0-3474-4c6c-a52a-f414b58a875b"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:06.725575 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.725544 2546 generic.go:358] "Generic (PLEG): container finished" podID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerID="dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61" exitCode=0 Apr 17 17:29:06.725575 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.725571 2546 generic.go:358] "Generic (PLEG): container finished" podID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerID="26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934" exitCode=0 Apr 17 17:29:06.725783 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.725596 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d11f87d0-3474-4c6c-a52a-f414b58a875b","Type":"ContainerDied","Data":"dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61"} Apr 17 17:29:06.725783 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.725630 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d11f87d0-3474-4c6c-a52a-f414b58a875b","Type":"ContainerDied","Data":"26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934"} Apr 17 17:29:06.725783 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.725643 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d11f87d0-3474-4c6c-a52a-f414b58a875b","Type":"ContainerDied","Data":"c83070f11beed6f7f251e1020d41010db0be307aa0a402b0743171ae2d3c0143"} Apr 17 17:29:06.725783 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.725654 2546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:29:06.725783 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.725657 2546 scope.go:117] "RemoveContainer" containerID="8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2" Apr 17 17:29:06.726345 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.726196 2546 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-web-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:06.726345 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.726222 2546 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-main-tls\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:06.726345 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.726239 2546 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d11f87d0-3474-4c6c-a52a-f414b58a875b-config-out\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:06.726345 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.726253 2546 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:06.726345 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.726264 2546 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-config-volume\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:06.726345 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.726273 2546 reconciler_common.go:299] "Volume detached for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d11f87d0-3474-4c6c-a52a-f414b58a875b-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:06.726345 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.726285 2546 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:06.726345 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.726300 2546 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d11f87d0-3474-4c6c-a52a-f414b58a875b-alertmanager-main-db\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:06.726345 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.726315 2546 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d11f87d0-3474-4c6c-a52a-f414b58a875b-tls-assets\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:06.726345 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.726331 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rzg9t\" (UniqueName: \"kubernetes.io/projected/d11f87d0-3474-4c6c-a52a-f414b58a875b-kube-api-access-rzg9t\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:06.726345 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.726345 2546 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:06.726798 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.726360 2546 reconciler_common.go:299] "Volume detached 
for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d11f87d0-3474-4c6c-a52a-f414b58a875b-cluster-tls-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:06.726798 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.726373 2546 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d11f87d0-3474-4c6c-a52a-f414b58a875b-metrics-client-ca\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:06.733853 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.733836 2546 scope.go:117] "RemoveContainer" containerID="dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61" Apr 17 17:29:06.740354 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.740339 2546 scope.go:117] "RemoveContainer" containerID="2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915" Apr 17 17:29:06.746120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.746096 2546 scope.go:117] "RemoveContainer" containerID="26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934" Apr 17 17:29:06.748166 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.748146 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:29:06.752298 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.752280 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:29:06.752610 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.752598 2546 scope.go:117] "RemoveContainer" containerID="2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9" Apr 17 17:29:06.759402 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.759387 2546 scope.go:117] "RemoveContainer" containerID="1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614" Apr 17 17:29:06.765503 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.765485 2546 scope.go:117] "RemoveContainer" 
containerID="ef283b5799330ef925325781c638d1476d1dd5bfefd6f13f3b06fa4b33cf6aee" Apr 17 17:29:06.771371 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.771356 2546 scope.go:117] "RemoveContainer" containerID="8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2" Apr 17 17:29:06.771632 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:29:06.771615 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2\": container with ID starting with 8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2 not found: ID does not exist" containerID="8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2" Apr 17 17:29:06.771692 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.771641 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2"} err="failed to get container status \"8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2\": rpc error: code = NotFound desc = could not find container \"8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2\": container with ID starting with 8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2 not found: ID does not exist" Apr 17 17:29:06.771692 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.771661 2546 scope.go:117] "RemoveContainer" containerID="dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61" Apr 17 17:29:06.771925 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:29:06.771909 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61\": container with ID starting with dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61 not found: ID does not exist" 
containerID="dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61"
Apr 17 17:29:06.771967 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.771933 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61"} err="failed to get container status \"dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61\": rpc error: code = NotFound desc = could not find container \"dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61\": container with ID starting with dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61 not found: ID does not exist"
Apr 17 17:29:06.771967 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.771947 2546 scope.go:117] "RemoveContainer" containerID="2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915"
Apr 17 17:29:06.772191 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:29:06.772175 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915\": container with ID starting with 2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915 not found: ID does not exist" containerID="2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915"
Apr 17 17:29:06.772248 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.772194 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915"} err="failed to get container status \"2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915\": rpc error: code = NotFound desc = could not find container \"2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915\": container with ID starting with 2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915 not found: ID does not exist"
Apr 17 17:29:06.772248 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.772208 2546 scope.go:117] "RemoveContainer" containerID="26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934"
Apr 17 17:29:06.772410 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:29:06.772394 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934\": container with ID starting with 26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934 not found: ID does not exist" containerID="26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934"
Apr 17 17:29:06.772447 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.772415 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934"} err="failed to get container status \"26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934\": rpc error: code = NotFound desc = could not find container \"26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934\": container with ID starting with 26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934 not found: ID does not exist"
Apr 17 17:29:06.772447 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.772429 2546 scope.go:117] "RemoveContainer" containerID="2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9"
Apr 17 17:29:06.772633 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:29:06.772618 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9\": container with ID starting with 2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9 not found: ID does not exist" containerID="2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9"
Apr 17 17:29:06.772698 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.772634 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9"} err="failed to get container status \"2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9\": rpc error: code = NotFound desc = could not find container \"2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9\": container with ID starting with 2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9 not found: ID does not exist"
Apr 17 17:29:06.772698 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.772646 2546 scope.go:117] "RemoveContainer" containerID="1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614"
Apr 17 17:29:06.772850 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:29:06.772830 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614\": container with ID starting with 1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614 not found: ID does not exist" containerID="1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614"
Apr 17 17:29:06.772888 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.772854 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614"} err="failed to get container status \"1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614\": rpc error: code = NotFound desc = could not find container \"1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614\": container with ID starting with 1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614 not found: ID does not exist"
Apr 17 17:29:06.772888 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.772869 2546 scope.go:117] "RemoveContainer" containerID="ef283b5799330ef925325781c638d1476d1dd5bfefd6f13f3b06fa4b33cf6aee"
Apr 17 17:29:06.773066 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:29:06.773052 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef283b5799330ef925325781c638d1476d1dd5bfefd6f13f3b06fa4b33cf6aee\": container with ID starting with ef283b5799330ef925325781c638d1476d1dd5bfefd6f13f3b06fa4b33cf6aee not found: ID does not exist" containerID="ef283b5799330ef925325781c638d1476d1dd5bfefd6f13f3b06fa4b33cf6aee"
Apr 17 17:29:06.773110 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.773070 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef283b5799330ef925325781c638d1476d1dd5bfefd6f13f3b06fa4b33cf6aee"} err="failed to get container status \"ef283b5799330ef925325781c638d1476d1dd5bfefd6f13f3b06fa4b33cf6aee\": rpc error: code = NotFound desc = could not find container \"ef283b5799330ef925325781c638d1476d1dd5bfefd6f13f3b06fa4b33cf6aee\": container with ID starting with ef283b5799330ef925325781c638d1476d1dd5bfefd6f13f3b06fa4b33cf6aee not found: ID does not exist"
Apr 17 17:29:06.773110 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.773085 2546 scope.go:117] "RemoveContainer" containerID="8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2"
Apr 17 17:29:06.773304 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.773286 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2"} err="failed to get container status \"8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2\": rpc error: code = NotFound desc = could not find container \"8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2\": container with ID starting with 8ec403549661733828e8395cdcf16850ec6e3774c56f8afc425a122b65ed1de2 not found: ID does not exist"
Apr 17 17:29:06.773358 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.773305 2546 scope.go:117] "RemoveContainer" containerID="dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61"
Apr 17 17:29:06.773510 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.773495 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61"} err="failed to get container status \"dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61\": rpc error: code = NotFound desc = could not find container \"dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61\": container with ID starting with dc689ad9ed2860fc77275db40c64c1a9e176c531ebd7525fe05d3f04bd29be61 not found: ID does not exist"
Apr 17 17:29:06.773558 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.773511 2546 scope.go:117] "RemoveContainer" containerID="2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915"
Apr 17 17:29:06.773714 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.773664 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915"} err="failed to get container status \"2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915\": rpc error: code = NotFound desc = could not find container \"2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915\": container with ID starting with 2f7fed0883233e718322c2fb7bb49766675fbe105a67aefe55395bad3f00b915 not found: ID does not exist"
Apr 17 17:29:06.773714 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.773696 2546 scope.go:117] "RemoveContainer" containerID="26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934"
Apr 17 17:29:06.773887 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.773867 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934"} err="failed to get container status \"26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934\": rpc error: code = NotFound desc = could not find container \"26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934\": container with ID starting with 26e426137322e280277954ad49b08a8cd301ba8fd4f97a7d3f86a04332a87934 not found: ID does not exist"
Apr 17 17:29:06.773953 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.773889 2546 scope.go:117] "RemoveContainer" containerID="2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9"
Apr 17 17:29:06.774132 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.774116 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9"} err="failed to get container status \"2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9\": rpc error: code = NotFound desc = could not find container \"2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9\": container with ID starting with 2f66274506c63986177c815f82540cbe789baa213b1da8fa2f2bc7581fde0cb9 not found: ID does not exist"
Apr 17 17:29:06.774178 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.774133 2546 scope.go:117] "RemoveContainer" containerID="1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614"
Apr 17 17:29:06.774328 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.774311 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614"} err="failed to get container status \"1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614\": rpc error: code = NotFound desc = could not find container \"1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614\": container with ID starting with 1a4711a1d204ecdcadb1eb0a31107702c56ba62581663ae1a6593b404a2c7614 not found: ID does not exist"
Apr 17 17:29:06.774387 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.774329 2546 scope.go:117] "RemoveContainer" containerID="ef283b5799330ef925325781c638d1476d1dd5bfefd6f13f3b06fa4b33cf6aee"
Apr 17 17:29:06.774527 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.774510 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef283b5799330ef925325781c638d1476d1dd5bfefd6f13f3b06fa4b33cf6aee"} err="failed to get container status \"ef283b5799330ef925325781c638d1476d1dd5bfefd6f13f3b06fa4b33cf6aee\": rpc error: code = NotFound desc = could not find container \"ef283b5799330ef925325781c638d1476d1dd5bfefd6f13f3b06fa4b33cf6aee\": container with ID starting with ef283b5799330ef925325781c638d1476d1dd5bfefd6f13f3b06fa4b33cf6aee not found: ID does not exist"
Apr 17 17:29:06.781206 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781186 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:29:06.781477 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781451 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="kube-rbac-proxy-metric"
Apr 17 17:29:06.781517 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781481 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="kube-rbac-proxy-metric"
Apr 17 17:29:06.781517 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781494 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="kube-rbac-proxy"
Apr 17 17:29:06.781517 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781499 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="kube-rbac-proxy"
Apr 17 17:29:06.781517 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781505 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f" containerName="console"
Apr 17 17:29:06.781517 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781512 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f" containerName="console"
Apr 17 17:29:06.781656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781521 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="config-reloader"
Apr 17 17:29:06.781656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781526 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="config-reloader"
Apr 17 17:29:06.781656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781533 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="init-config-reloader"
Apr 17 17:29:06.781656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781539 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="init-config-reloader"
Apr 17 17:29:06.781656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781547 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="kube-rbac-proxy-web"
Apr 17 17:29:06.781656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781556 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="kube-rbac-proxy-web"
Apr 17 17:29:06.781656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781564 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43b8a62b-9f9f-4274-a62a-7a991a1ebb1d" containerName="registry"
Apr 17 17:29:06.781656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781569 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b8a62b-9f9f-4274-a62a-7a991a1ebb1d" containerName="registry"
Apr 17 17:29:06.781656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781576 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="alertmanager"
Apr 17 17:29:06.781656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781581 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="alertmanager"
Apr 17 17:29:06.781656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781587 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="prom-label-proxy"
Apr 17 17:29:06.781656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781593 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="prom-label-proxy"
Apr 17 17:29:06.781656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781638 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="alertmanager"
Apr 17 17:29:06.781656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781646 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="43b8a62b-9f9f-4274-a62a-7a991a1ebb1d" containerName="registry"
Apr 17 17:29:06.781656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781654 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="config-reloader"
Apr 17 17:29:06.781656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781660 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="kube-rbac-proxy"
Apr 17 17:29:06.782323 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781667 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="kube-rbac-proxy-metric"
Apr 17 17:29:06.782323 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781673 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="prom-label-proxy"
Apr 17 17:29:06.782323 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781707 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b" containerName="kube-rbac-proxy-web"
Apr 17 17:29:06.782323 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.781717 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="22e1d5eb-2a9e-4dc9-9d2a-97bf8460f39f" containerName="console"
Apr 17 17:29:06.785531 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.785516 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.787857 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.787836 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 17:29:06.787949 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.787913 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 17:29:06.788049 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.788026 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 17:29:06.788166 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.788153 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 17:29:06.788232 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.788187 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 17:29:06.788232 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.788190 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 17:29:06.788617 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.788594 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 17:29:06.788714 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.788692 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-fjjzs\""
Apr 17 17:29:06.788714 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.788708 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 17:29:06.793356 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.793338 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 17:29:06.801628 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.801607 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:29:06.827064 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.827033 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.827177 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.827074 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/289f3232-3555-4e14-a1f1-ef291fa65ef9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.827177 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.827097 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/289f3232-3555-4e14-a1f1-ef291fa65ef9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.827177 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.827117 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-web-config\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.827177 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.827144 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvsnn\" (UniqueName: \"kubernetes.io/projected/289f3232-3555-4e14-a1f1-ef291fa65ef9-kube-api-access-zvsnn\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.827326 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.827185 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.827326 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.827205 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.827326 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.827232 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/289f3232-3555-4e14-a1f1-ef291fa65ef9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.827326 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.827248 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/289f3232-3555-4e14-a1f1-ef291fa65ef9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.827326 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.827287 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/289f3232-3555-4e14-a1f1-ef291fa65ef9-config-out\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.827326 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.827308 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.827326 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.827326 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-config-volume\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.827581 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.827392 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.927912 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.927818 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/289f3232-3555-4e14-a1f1-ef291fa65ef9-config-out\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.927912 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.927854 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.927912 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.927877 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-config-volume\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.927912 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.927900 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.928223 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.927940 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.928223 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.927962 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/289f3232-3555-4e14-a1f1-ef291fa65ef9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.928223 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.927986 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/289f3232-3555-4e14-a1f1-ef291fa65ef9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.928223 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.928016 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-web-config\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.928223 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.928046 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvsnn\" (UniqueName: \"kubernetes.io/projected/289f3232-3555-4e14-a1f1-ef291fa65ef9-kube-api-access-zvsnn\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.928223 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.928079 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.928223 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.928105 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.928223 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.928162 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/289f3232-3555-4e14-a1f1-ef291fa65ef9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.928223 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.928191 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/289f3232-3555-4e14-a1f1-ef291fa65ef9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.928629 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.928507 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/289f3232-3555-4e14-a1f1-ef291fa65ef9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.929545 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.929180 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/289f3232-3555-4e14-a1f1-ef291fa65ef9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.931060 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.931033 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-config-volume\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.931165 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.931076 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/289f3232-3555-4e14-a1f1-ef291fa65ef9-config-out\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.931165 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.931122 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.931293 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.931172 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.931293 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.931223 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.931293 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.931226 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/289f3232-3555-4e14-a1f1-ef291fa65ef9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.931444 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.931399 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.931542 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.931523 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/289f3232-3555-4e14-a1f1-ef291fa65ef9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.932140 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.932118 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.932901 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.932885 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/289f3232-3555-4e14-a1f1-ef291fa65ef9-web-config\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:06.939609 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:06.939583 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvsnn\" (UniqueName: \"kubernetes.io/projected/289f3232-3555-4e14-a1f1-ef291fa65ef9-kube-api-access-zvsnn\") pod \"alertmanager-main-0\" (UID: \"289f3232-3555-4e14-a1f1-ef291fa65ef9\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:07.095282 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:07.095252 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:29:07.221109 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:07.221029 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:29:07.225224 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:29:07.225197 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod289f3232_3555_4e14_a1f1_ef291fa65ef9.slice/crio-f5d24a535eab60c449791d67842b38e2de93ff4e6e836933e0b52d263862b006 WatchSource:0}: Error finding container f5d24a535eab60c449791d67842b38e2de93ff4e6e836933e0b52d263862b006: Status 404 returned error can't find the container with id f5d24a535eab60c449791d67842b38e2de93ff4e6e836933e0b52d263862b006
Apr 17 17:29:07.734578 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:07.734532 2546 generic.go:358] "Generic (PLEG): container finished" podID="289f3232-3555-4e14-a1f1-ef291fa65ef9" containerID="578d349eb9c044bec13b32ba67351f9245ca37e8a009c67f2ee2f845eecce74f" exitCode=0
Apr 17 17:29:07.735015 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:07.734650 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"289f3232-3555-4e14-a1f1-ef291fa65ef9","Type":"ContainerDied","Data":"578d349eb9c044bec13b32ba67351f9245ca37e8a009c67f2ee2f845eecce74f"}
Apr 17 17:29:07.735015 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:07.734720 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"289f3232-3555-4e14-a1f1-ef291fa65ef9","Type":"ContainerStarted","Data":"f5d24a535eab60c449791d67842b38e2de93ff4e6e836933e0b52d263862b006"}
Apr 17 17:29:07.953629 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:07.953601 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11f87d0-3474-4c6c-a52a-f414b58a875b"
path="/var/lib/kubelet/pods/d11f87d0-3474-4c6c-a52a-f414b58a875b/volumes" Apr 17 17:29:08.741048 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:08.741017 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"289f3232-3555-4e14-a1f1-ef291fa65ef9","Type":"ContainerStarted","Data":"7e90339c46f942f393b18c9d6fe58600a3035fa8ef1346f80f3e4837fbb59fe1"} Apr 17 17:29:08.741048 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:08.741049 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"289f3232-3555-4e14-a1f1-ef291fa65ef9","Type":"ContainerStarted","Data":"6dded1102749e0217197c4a1d4703a60c64e0c605d8c6d23ec7d9dbd915fb74e"} Apr 17 17:29:08.741449 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:08.741060 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"289f3232-3555-4e14-a1f1-ef291fa65ef9","Type":"ContainerStarted","Data":"96ca418f4067681e84c9f7643accd87dfecb13e7a9de5c0adcba78e5922df3ba"} Apr 17 17:29:08.741449 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:08.741068 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"289f3232-3555-4e14-a1f1-ef291fa65ef9","Type":"ContainerStarted","Data":"79f73b68f6b0373336d97f4401fc4b6289a636d229427a599eb74d751b627fe1"} Apr 17 17:29:08.741449 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:08.741077 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"289f3232-3555-4e14-a1f1-ef291fa65ef9","Type":"ContainerStarted","Data":"56a1fbda65560cb5db3577e97e5c802c01b95227b525b106db989a802e381b6b"} Apr 17 17:29:08.741449 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:08.741085 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"289f3232-3555-4e14-a1f1-ef291fa65ef9","Type":"ContainerStarted","Data":"090b74950e7dd5fb6622d9b28a2e4df7fb81de69979e10e9926ce0b2ac0d7105"} Apr 17 17:29:08.771274 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:08.771231 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.771216547 podStartE2EDuration="2.771216547s" podCreationTimestamp="2026-04-17 17:29:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:29:08.768521499 +0000 UTC m=+241.390802186" watchObservedRunningTime="2026-04-17 17:29:08.771216547 +0000 UTC m=+241.393497265" Apr 17 17:29:09.348014 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.347980 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-8486f94db4-5fm4k"] Apr 17 17:29:09.350270 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.350253 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.352697 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.352653 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 17:29:09.352811 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.352725 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 17:29:09.352811 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.352653 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 17:29:09.352811 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.352773 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-zvnks\"" Apr 17 17:29:09.352811 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.352725 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 17:29:09.353019 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.352944 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 17:29:09.358710 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.358666 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 17:29:09.361851 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.361831 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-8486f94db4-5fm4k"] Apr 17 17:29:09.448550 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.448518 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d4df2b4e-6557-426d-b06d-bda244334381-telemeter-client-tls\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.448769 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.448575 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4df2b4e-6557-426d-b06d-bda244334381-metrics-client-ca\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.448769 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.448607 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4df2b4e-6557-426d-b06d-bda244334381-serving-certs-ca-bundle\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.448769 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.448635 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d4df2b4e-6557-426d-b06d-bda244334381-federate-client-tls\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.448898 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.448775 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m9cq\" (UniqueName: \"kubernetes.io/projected/d4df2b4e-6557-426d-b06d-bda244334381-kube-api-access-8m9cq\") pod 
\"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.448898 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.448812 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4df2b4e-6557-426d-b06d-bda244334381-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.448898 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.448842 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4df2b4e-6557-426d-b06d-bda244334381-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.449002 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.448892 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d4df2b4e-6557-426d-b06d-bda244334381-secret-telemeter-client\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.549740 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.549670 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8m9cq\" (UniqueName: \"kubernetes.io/projected/d4df2b4e-6557-426d-b06d-bda244334381-kube-api-access-8m9cq\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " 
pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.549740 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.549752 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4df2b4e-6557-426d-b06d-bda244334381-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.549993 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.549773 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4df2b4e-6557-426d-b06d-bda244334381-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.549993 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.549791 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d4df2b4e-6557-426d-b06d-bda244334381-secret-telemeter-client\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.549993 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.549813 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d4df2b4e-6557-426d-b06d-bda244334381-telemeter-client-tls\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.549993 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.549844 2546 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4df2b4e-6557-426d-b06d-bda244334381-metrics-client-ca\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.549993 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.549878 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4df2b4e-6557-426d-b06d-bda244334381-serving-certs-ca-bundle\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.550212 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.550083 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d4df2b4e-6557-426d-b06d-bda244334381-federate-client-tls\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.550669 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.550591 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4df2b4e-6557-426d-b06d-bda244334381-serving-certs-ca-bundle\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.550797 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.550718 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4df2b4e-6557-426d-b06d-bda244334381-metrics-client-ca\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: 
\"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.550797 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.550759 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4df2b4e-6557-426d-b06d-bda244334381-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.552416 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.552388 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d4df2b4e-6557-426d-b06d-bda244334381-federate-client-tls\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.552492 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.552475 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d4df2b4e-6557-426d-b06d-bda244334381-telemeter-client-tls\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.552759 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.552744 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d4df2b4e-6557-426d-b06d-bda244334381-secret-telemeter-client\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.552872 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.552851 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4df2b4e-6557-426d-b06d-bda244334381-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.559751 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.559729 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m9cq\" (UniqueName: \"kubernetes.io/projected/d4df2b4e-6557-426d-b06d-bda244334381-kube-api-access-8m9cq\") pod \"telemeter-client-8486f94db4-5fm4k\" (UID: \"d4df2b4e-6557-426d-b06d-bda244334381\") " pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.660344 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.660256 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" Apr 17 17:29:09.781975 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:09.781946 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-8486f94db4-5fm4k"] Apr 17 17:29:09.784931 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:29:09.784894 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4df2b4e_6557_426d_b06d_bda244334381.slice/crio-845d88f44f2bc93bcc4ab1ab3b7c52788daa321f76d797699ccfe28fcd45a043 WatchSource:0}: Error finding container 845d88f44f2bc93bcc4ab1ab3b7c52788daa321f76d797699ccfe28fcd45a043: Status 404 returned error can't find the container with id 845d88f44f2bc93bcc4ab1ab3b7c52788daa321f76d797699ccfe28fcd45a043 Apr 17 17:29:10.748558 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:10.748521 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" 
event={"ID":"d4df2b4e-6557-426d-b06d-bda244334381","Type":"ContainerStarted","Data":"845d88f44f2bc93bcc4ab1ab3b7c52788daa321f76d797699ccfe28fcd45a043"} Apr 17 17:29:12.755622 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:12.755591 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" event={"ID":"d4df2b4e-6557-426d-b06d-bda244334381","Type":"ContainerStarted","Data":"6c96b3ea1510159741090bc6b304acf4ec777a2b14c8b75d612c2ff715fd7d5a"} Apr 17 17:29:12.755622 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:12.755625 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" event={"ID":"d4df2b4e-6557-426d-b06d-bda244334381","Type":"ContainerStarted","Data":"4ffc6ed55c53996d47fbde9fc8dc110bd793132c4d85c7fae0617e0336f9d1b9"} Apr 17 17:29:12.756042 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:12.755636 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" event={"ID":"d4df2b4e-6557-426d-b06d-bda244334381","Type":"ContainerStarted","Data":"8221e4f5b38c650bb87e1e87b7a717dccbec407bd98a011e2303421fad5e5828"} Apr 17 17:29:12.787061 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:12.787009 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-8486f94db4-5fm4k" podStartSLOduration=1.8190866209999998 podStartE2EDuration="3.78699288s" podCreationTimestamp="2026-04-17 17:29:09 +0000 UTC" firstStartedPulling="2026-04-17 17:29:09.786724662 +0000 UTC m=+242.409005319" lastFinishedPulling="2026-04-17 17:29:11.754630926 +0000 UTC m=+244.376911578" observedRunningTime="2026-04-17 17:29:12.785240125 +0000 UTC m=+245.407520800" watchObservedRunningTime="2026-04-17 17:29:12.78699288 +0000 UTC m=+245.409273551" Apr 17 17:29:13.365196 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.365159 2546 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-86c9487b9d-ssbgw"] Apr 17 17:29:13.367725 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.367702 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.378884 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.378862 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86c9487b9d-ssbgw"] Apr 17 17:29:13.484382 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.484344 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-oauth-config\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.484382 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.484383 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s66zv\" (UniqueName: \"kubernetes.io/projected/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-kube-api-access-s66zv\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.484589 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.484403 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-config\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.484589 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.484528 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-oauth-serving-cert\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.484589 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.484560 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-service-ca\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.484722 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.484593 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-serving-cert\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.484722 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.484653 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-trusted-ca-bundle\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.585404 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.585371 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s66zv\" (UniqueName: \"kubernetes.io/projected/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-kube-api-access-s66zv\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.585404 ip-10-0-137-46 kubenswrapper[2546]: I0417 
17:29:13.585407 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-config\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.585662 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.585451 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-oauth-serving-cert\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.585662 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.585479 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-service-ca\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.585662 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.585501 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-serving-cert\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.585662 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.585551 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-trusted-ca-bundle\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 
17:29:13.585662 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.585580 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-oauth-config\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.586336 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.586303 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-oauth-serving-cert\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.586336 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.586327 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-service-ca\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.586522 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.586327 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-config\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.586522 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.586477 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-trusted-ca-bundle\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " 
pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.588098 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.588074 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-oauth-config\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.588205 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.588179 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-serving-cert\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.596161 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.596134 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s66zv\" (UniqueName: \"kubernetes.io/projected/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-kube-api-access-s66zv\") pod \"console-86c9487b9d-ssbgw\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.676888 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.676799 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:13.818205 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:13.818134 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86c9487b9d-ssbgw"] Apr 17 17:29:13.820279 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:29:13.820250 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fddc433_31ea_46ab_a059_c2afe3a0bb9c.slice/crio-d7e9005ffa64f9d37ada745dc92cf2b05070ad9a9c7343b178854157487eecb6 WatchSource:0}: Error finding container d7e9005ffa64f9d37ada745dc92cf2b05070ad9a9c7343b178854157487eecb6: Status 404 returned error can't find the container with id d7e9005ffa64f9d37ada745dc92cf2b05070ad9a9c7343b178854157487eecb6 Apr 17 17:29:14.764462 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:14.764428 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86c9487b9d-ssbgw" event={"ID":"6fddc433-31ea-46ab-a059-c2afe3a0bb9c","Type":"ContainerStarted","Data":"ede5d48c65871dc80054a8d0cb88e91b70933cd82259cb18403c8dbf3319b48c"} Apr 17 17:29:14.764462 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:14.764466 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86c9487b9d-ssbgw" event={"ID":"6fddc433-31ea-46ab-a059-c2afe3a0bb9c","Type":"ContainerStarted","Data":"d7e9005ffa64f9d37ada745dc92cf2b05070ad9a9c7343b178854157487eecb6"} Apr 17 17:29:14.783465 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:14.783417 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86c9487b9d-ssbgw" podStartSLOduration=1.7834041809999999 podStartE2EDuration="1.783404181s" podCreationTimestamp="2026-04-17 17:29:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:29:14.782837395 +0000 UTC 
m=+247.405118070" watchObservedRunningTime="2026-04-17 17:29:14.783404181 +0000 UTC m=+247.405684855" Apr 17 17:29:19.847181 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:19.847124 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs\") pod \"network-metrics-daemon-fbmql\" (UID: \"173598bb-6dcc-46e9-a78f-f3d5c1fd4297\") " pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:29:19.849389 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:19.849368 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/173598bb-6dcc-46e9-a78f-f3d5c1fd4297-metrics-certs\") pod \"network-metrics-daemon-fbmql\" (UID: \"173598bb-6dcc-46e9-a78f-f3d5c1fd4297\") " pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:29:20.053143 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:20.053113 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lkqvr\"" Apr 17 17:29:20.061226 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:20.061207 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbmql" Apr 17 17:29:20.384170 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:20.384147 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fbmql"] Apr 17 17:29:20.386440 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:29:20.386413 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod173598bb_6dcc_46e9_a78f_f3d5c1fd4297.slice/crio-986308e538e4b9b97822f54157f553719d4bdcf9e31dc6260d5f3871c667b101 WatchSource:0}: Error finding container 986308e538e4b9b97822f54157f553719d4bdcf9e31dc6260d5f3871c667b101: Status 404 returned error can't find the container with id 986308e538e4b9b97822f54157f553719d4bdcf9e31dc6260d5f3871c667b101 Apr 17 17:29:20.783594 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:20.783508 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fbmql" event={"ID":"173598bb-6dcc-46e9-a78f-f3d5c1fd4297","Type":"ContainerStarted","Data":"986308e538e4b9b97822f54157f553719d4bdcf9e31dc6260d5f3871c667b101"} Apr 17 17:29:21.787902 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:21.787869 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fbmql" event={"ID":"173598bb-6dcc-46e9-a78f-f3d5c1fd4297","Type":"ContainerStarted","Data":"d6d5373e7722ebcee6d996ab508119b496c613eeb354e42e4ae6d28f58f8e5f8"} Apr 17 17:29:21.788260 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:21.787908 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fbmql" event={"ID":"173598bb-6dcc-46e9-a78f-f3d5c1fd4297","Type":"ContainerStarted","Data":"c8ccb6d58d9e2a2b4bdbc1ed4ac9f89521487b6aa821f947440cbba64e52e627"} Apr 17 17:29:21.804162 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:21.804104 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-fbmql" podStartSLOduration=252.883168613 podStartE2EDuration="4m13.804084762s" podCreationTimestamp="2026-04-17 17:25:08 +0000 UTC" firstStartedPulling="2026-04-17 17:29:20.388276597 +0000 UTC m=+253.010557253" lastFinishedPulling="2026-04-17 17:29:21.309192748 +0000 UTC m=+253.931473402" observedRunningTime="2026-04-17 17:29:21.803452125 +0000 UTC m=+254.425732800" watchObservedRunningTime="2026-04-17 17:29:21.804084762 +0000 UTC m=+254.426365438" Apr 17 17:29:23.677508 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:23.677454 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:23.677868 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:23.677519 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:23.681857 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:23.681837 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:23.796737 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:23.796711 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:29:23.865749 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:23.865697 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8bf588c6c-fjlbl"] Apr 17 17:29:48.885637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:48.885594 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8bf588c6c-fjlbl" podUID="ca0f866a-8b99-4281-884c-c85069491b36" containerName="console" containerID="cri-o://0188eeac3a3cc39fde9cd9f4619776f714130465c7c2bca25aab0919a256aacd" gracePeriod=15 Apr 17 17:29:49.132770 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.132745 2546 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-console_console-8bf588c6c-fjlbl_ca0f866a-8b99-4281-884c-c85069491b36/console/0.log" Apr 17 17:29:49.132899 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.132844 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8bf588c6c-fjlbl" Apr 17 17:29:49.190748 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.190656 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca0f866a-8b99-4281-884c-c85069491b36-console-serving-cert\") pod \"ca0f866a-8b99-4281-884c-c85069491b36\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " Apr 17 17:29:49.190877 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.190756 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cxg9\" (UniqueName: \"kubernetes.io/projected/ca0f866a-8b99-4281-884c-c85069491b36-kube-api-access-9cxg9\") pod \"ca0f866a-8b99-4281-884c-c85069491b36\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " Apr 17 17:29:49.190877 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.190784 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-oauth-serving-cert\") pod \"ca0f866a-8b99-4281-884c-c85069491b36\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " Apr 17 17:29:49.190877 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.190805 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-trusted-ca-bundle\") pod \"ca0f866a-8b99-4281-884c-c85069491b36\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " Apr 17 17:29:49.191009 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.190916 2546 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-service-ca\") pod \"ca0f866a-8b99-4281-884c-c85069491b36\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " Apr 17 17:29:49.191009 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.190965 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca0f866a-8b99-4281-884c-c85069491b36-console-oauth-config\") pod \"ca0f866a-8b99-4281-884c-c85069491b36\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " Apr 17 17:29:49.191123 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.191098 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-console-config\") pod \"ca0f866a-8b99-4281-884c-c85069491b36\" (UID: \"ca0f866a-8b99-4281-884c-c85069491b36\") " Apr 17 17:29:49.191242 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.191220 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ca0f866a-8b99-4281-884c-c85069491b36" (UID: "ca0f866a-8b99-4281-884c-c85069491b36"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:29:49.191408 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.191368 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-service-ca" (OuterVolumeSpecName: "service-ca") pod "ca0f866a-8b99-4281-884c-c85069491b36" (UID: "ca0f866a-8b99-4281-884c-c85069491b36"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:29:49.191408 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.191389 2546 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-oauth-serving-cert\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:49.191526 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.191424 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-console-config" (OuterVolumeSpecName: "console-config") pod "ca0f866a-8b99-4281-884c-c85069491b36" (UID: "ca0f866a-8b99-4281-884c-c85069491b36"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:29:49.191526 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.191458 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ca0f866a-8b99-4281-884c-c85069491b36" (UID: "ca0f866a-8b99-4281-884c-c85069491b36"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:29:49.192837 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.192810 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca0f866a-8b99-4281-884c-c85069491b36-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ca0f866a-8b99-4281-884c-c85069491b36" (UID: "ca0f866a-8b99-4281-884c-c85069491b36"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:49.193336 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.193318 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca0f866a-8b99-4281-884c-c85069491b36-kube-api-access-9cxg9" (OuterVolumeSpecName: "kube-api-access-9cxg9") pod "ca0f866a-8b99-4281-884c-c85069491b36" (UID: "ca0f866a-8b99-4281-884c-c85069491b36"). InnerVolumeSpecName "kube-api-access-9cxg9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:29:49.193399 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.193318 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca0f866a-8b99-4281-884c-c85069491b36-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ca0f866a-8b99-4281-884c-c85069491b36" (UID: "ca0f866a-8b99-4281-884c-c85069491b36"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:29:49.292785 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.292749 2546 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca0f866a-8b99-4281-884c-c85069491b36-console-serving-cert\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:49.292785 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.292778 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9cxg9\" (UniqueName: \"kubernetes.io/projected/ca0f866a-8b99-4281-884c-c85069491b36-kube-api-access-9cxg9\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:49.292785 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.292789 2546 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-trusted-ca-bundle\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:49.293017 
ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.292798 2546 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-service-ca\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:49.293017 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.292807 2546 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca0f866a-8b99-4281-884c-c85069491b36-console-oauth-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:49.293017 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.292816 2546 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca0f866a-8b99-4281-884c-c85069491b36-console-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:29:49.869399 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.869374 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8bf588c6c-fjlbl_ca0f866a-8b99-4281-884c-c85069491b36/console/0.log" Apr 17 17:29:49.869602 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.869411 2546 generic.go:358] "Generic (PLEG): container finished" podID="ca0f866a-8b99-4281-884c-c85069491b36" containerID="0188eeac3a3cc39fde9cd9f4619776f714130465c7c2bca25aab0919a256aacd" exitCode=2 Apr 17 17:29:49.869602 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.869445 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8bf588c6c-fjlbl" event={"ID":"ca0f866a-8b99-4281-884c-c85069491b36","Type":"ContainerDied","Data":"0188eeac3a3cc39fde9cd9f4619776f714130465c7c2bca25aab0919a256aacd"} Apr 17 17:29:49.869602 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.869482 2546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8bf588c6c-fjlbl" Apr 17 17:29:49.869602 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.869500 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8bf588c6c-fjlbl" event={"ID":"ca0f866a-8b99-4281-884c-c85069491b36","Type":"ContainerDied","Data":"6bd0cf1c17322c594c1834a2dc8927e061aacc545edbac0255fbe41e9cc27e7a"} Apr 17 17:29:49.869602 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.869524 2546 scope.go:117] "RemoveContainer" containerID="0188eeac3a3cc39fde9cd9f4619776f714130465c7c2bca25aab0919a256aacd" Apr 17 17:29:49.880048 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.880031 2546 scope.go:117] "RemoveContainer" containerID="0188eeac3a3cc39fde9cd9f4619776f714130465c7c2bca25aab0919a256aacd" Apr 17 17:29:49.880318 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:29:49.880300 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0188eeac3a3cc39fde9cd9f4619776f714130465c7c2bca25aab0919a256aacd\": container with ID starting with 0188eeac3a3cc39fde9cd9f4619776f714130465c7c2bca25aab0919a256aacd not found: ID does not exist" containerID="0188eeac3a3cc39fde9cd9f4619776f714130465c7c2bca25aab0919a256aacd" Apr 17 17:29:49.880368 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.880327 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0188eeac3a3cc39fde9cd9f4619776f714130465c7c2bca25aab0919a256aacd"} err="failed to get container status \"0188eeac3a3cc39fde9cd9f4619776f714130465c7c2bca25aab0919a256aacd\": rpc error: code = NotFound desc = could not find container \"0188eeac3a3cc39fde9cd9f4619776f714130465c7c2bca25aab0919a256aacd\": container with ID starting with 0188eeac3a3cc39fde9cd9f4619776f714130465c7c2bca25aab0919a256aacd not found: ID does not exist" Apr 17 17:29:49.891869 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.891847 2546 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-8bf588c6c-fjlbl"] Apr 17 17:29:49.895343 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.895320 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8bf588c6c-fjlbl"] Apr 17 17:29:49.954215 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:29:49.954177 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca0f866a-8b99-4281-884c-c85069491b36" path="/var/lib/kubelet/pods/ca0f866a-8b99-4281-884c-c85069491b36/volumes" Apr 17 17:30:07.825163 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:07.825135 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/ovn-acl-logging/0.log" Apr 17 17:30:07.826611 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:07.826577 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/ovn-acl-logging/0.log" Apr 17 17:30:07.832096 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:07.832068 2546 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 17:30:29.134649 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.134616 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c4499b4f7-lhs25"] Apr 17 17:30:29.136968 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.134922 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca0f866a-8b99-4281-884c-c85069491b36" containerName="console" Apr 17 17:30:29.136968 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.134934 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca0f866a-8b99-4281-884c-c85069491b36" containerName="console" Apr 17 17:30:29.136968 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.134994 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca0f866a-8b99-4281-884c-c85069491b36" containerName="console" 
Apr 17 17:30:29.137813 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.137797 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.150225 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.150202 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c4499b4f7-lhs25"] Apr 17 17:30:29.230185 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.230151 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-oauth-serving-cert\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.230185 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.230190 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-config\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.230408 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.230208 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-oauth-config\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.230408 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.230232 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-service-ca\") pod 
\"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.230408 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.230254 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-trusted-ca-bundle\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.230408 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.230388 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg7ld\" (UniqueName: \"kubernetes.io/projected/6264fc53-ae53-460d-b9d8-ddd4db430c08-kube-api-access-bg7ld\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.230589 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.230449 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-serving-cert\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.331590 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.331546 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bg7ld\" (UniqueName: \"kubernetes.io/projected/6264fc53-ae53-460d-b9d8-ddd4db430c08-kube-api-access-bg7ld\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.331801 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.331603 2546 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-serving-cert\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.331801 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.331726 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-oauth-serving-cert\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.331801 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.331772 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-config\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.331801 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.331793 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-oauth-config\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.332011 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.331810 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-service-ca\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.332011 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.331826 2546 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-trusted-ca-bundle\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.332536 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.332511 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-config\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.332632 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.332572 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-oauth-serving-cert\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.332632 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.332601 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-service-ca\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.332832 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.332811 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-trusted-ca-bundle\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.334152 ip-10-0-137-46 kubenswrapper[2546]: I0417 
17:30:29.334122 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-serving-cert\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.334229 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.334200 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-oauth-config\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.342299 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.342276 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg7ld\" (UniqueName: \"kubernetes.io/projected/6264fc53-ae53-460d-b9d8-ddd4db430c08-kube-api-access-bg7ld\") pod \"console-5c4499b4f7-lhs25\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.446751 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.446610 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:29.571664 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.571597 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c4499b4f7-lhs25"] Apr 17 17:30:29.574355 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:30:29.574325 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6264fc53_ae53_460d_b9d8_ddd4db430c08.slice/crio-1d1ad9a0f90568ef5219321a1c1f7c3bc937dba57c2583a5748c508784ad1365 WatchSource:0}: Error finding container 1d1ad9a0f90568ef5219321a1c1f7c3bc937dba57c2583a5748c508784ad1365: Status 404 returned error can't find the container with id 1d1ad9a0f90568ef5219321a1c1f7c3bc937dba57c2583a5748c508784ad1365 Apr 17 17:30:29.576082 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.576064 2546 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:30:29.980640 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.980601 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c4499b4f7-lhs25" event={"ID":"6264fc53-ae53-460d-b9d8-ddd4db430c08","Type":"ContainerStarted","Data":"91af09575fc89786d3910898e2e352b65a745c10f59b60d12d1f73b94159b7dd"} Apr 17 17:30:29.980640 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:29.980639 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c4499b4f7-lhs25" event={"ID":"6264fc53-ae53-460d-b9d8-ddd4db430c08","Type":"ContainerStarted","Data":"1d1ad9a0f90568ef5219321a1c1f7c3bc937dba57c2583a5748c508784ad1365"} Apr 17 17:30:30.003822 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:30.003776 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c4499b4f7-lhs25" podStartSLOduration=1.003761814 podStartE2EDuration="1.003761814s" podCreationTimestamp="2026-04-17 17:30:29 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:30:30.002459225 +0000 UTC m=+322.624739926" watchObservedRunningTime="2026-04-17 17:30:30.003761814 +0000 UTC m=+322.626042488" Apr 17 17:30:39.447164 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:39.447126 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:39.447164 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:39.447168 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:39.451957 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:39.451932 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:40.014478 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:40.014451 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:30:40.065806 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:30:40.065769 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86c9487b9d-ssbgw"] Apr 17 17:31:05.087055 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.086959 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-86c9487b9d-ssbgw" podUID="6fddc433-31ea-46ab-a059-c2afe3a0bb9c" containerName="console" containerID="cri-o://ede5d48c65871dc80054a8d0cb88e91b70933cd82259cb18403c8dbf3319b48c" gracePeriod=15 Apr 17 17:31:05.317307 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.317279 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86c9487b9d-ssbgw_6fddc433-31ea-46ab-a059-c2afe3a0bb9c/console/0.log" Apr 17 17:31:05.317428 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.317350 2546 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:31:05.424305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.424213 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s66zv\" (UniqueName: \"kubernetes.io/projected/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-kube-api-access-s66zv\") pod \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " Apr 17 17:31:05.424305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.424268 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-config\") pod \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " Apr 17 17:31:05.424305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.424297 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-serving-cert\") pod \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " Apr 17 17:31:05.424581 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.424337 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-oauth-serving-cert\") pod \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " Apr 17 17:31:05.424581 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.424354 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-service-ca\") pod \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " Apr 17 
17:31:05.424581 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.424370 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-oauth-config\") pod \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " Apr 17 17:31:05.424581 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.424392 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-trusted-ca-bundle\") pod \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\" (UID: \"6fddc433-31ea-46ab-a059-c2afe3a0bb9c\") " Apr 17 17:31:05.424828 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.424798 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-service-ca" (OuterVolumeSpecName: "service-ca") pod "6fddc433-31ea-46ab-a059-c2afe3a0bb9c" (UID: "6fddc433-31ea-46ab-a059-c2afe3a0bb9c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:31:05.424879 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.424820 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-config" (OuterVolumeSpecName: "console-config") pod "6fddc433-31ea-46ab-a059-c2afe3a0bb9c" (UID: "6fddc433-31ea-46ab-a059-c2afe3a0bb9c"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:31:05.424879 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.424845 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6fddc433-31ea-46ab-a059-c2afe3a0bb9c" (UID: "6fddc433-31ea-46ab-a059-c2afe3a0bb9c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:31:05.424985 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.424928 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6fddc433-31ea-46ab-a059-c2afe3a0bb9c" (UID: "6fddc433-31ea-46ab-a059-c2afe3a0bb9c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:31:05.426564 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.426538 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6fddc433-31ea-46ab-a059-c2afe3a0bb9c" (UID: "6fddc433-31ea-46ab-a059-c2afe3a0bb9c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:31:05.426564 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.426550 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-kube-api-access-s66zv" (OuterVolumeSpecName: "kube-api-access-s66zv") pod "6fddc433-31ea-46ab-a059-c2afe3a0bb9c" (UID: "6fddc433-31ea-46ab-a059-c2afe3a0bb9c"). InnerVolumeSpecName "kube-api-access-s66zv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:31:05.426718 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.426615 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6fddc433-31ea-46ab-a059-c2afe3a0bb9c" (UID: "6fddc433-31ea-46ab-a059-c2afe3a0bb9c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:31:05.526017 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.525980 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s66zv\" (UniqueName: \"kubernetes.io/projected/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-kube-api-access-s66zv\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:31:05.526017 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.526011 2546 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:31:05.526017 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.526020 2546 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-serving-cert\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:31:05.526254 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.526030 2546 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-oauth-serving-cert\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:31:05.526254 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.526039 2546 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-service-ca\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:31:05.526254 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.526048 2546 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-console-oauth-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:31:05.526254 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:05.526056 2546 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fddc433-31ea-46ab-a059-c2afe3a0bb9c-trusted-ca-bundle\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:31:06.089198 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:06.089172 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86c9487b9d-ssbgw_6fddc433-31ea-46ab-a059-c2afe3a0bb9c/console/0.log" Apr 17 17:31:06.089570 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:06.089211 2546 generic.go:358] "Generic (PLEG): container finished" podID="6fddc433-31ea-46ab-a059-c2afe3a0bb9c" containerID="ede5d48c65871dc80054a8d0cb88e91b70933cd82259cb18403c8dbf3319b48c" exitCode=2 Apr 17 17:31:06.089570 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:06.089249 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86c9487b9d-ssbgw" event={"ID":"6fddc433-31ea-46ab-a059-c2afe3a0bb9c","Type":"ContainerDied","Data":"ede5d48c65871dc80054a8d0cb88e91b70933cd82259cb18403c8dbf3319b48c"} Apr 17 17:31:06.089570 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:06.089298 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86c9487b9d-ssbgw" event={"ID":"6fddc433-31ea-46ab-a059-c2afe3a0bb9c","Type":"ContainerDied","Data":"d7e9005ffa64f9d37ada745dc92cf2b05070ad9a9c7343b178854157487eecb6"} Apr 17 17:31:06.089570 ip-10-0-137-46 
kubenswrapper[2546]: I0417 17:31:06.089307 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86c9487b9d-ssbgw" Apr 17 17:31:06.089570 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:06.089314 2546 scope.go:117] "RemoveContainer" containerID="ede5d48c65871dc80054a8d0cb88e91b70933cd82259cb18403c8dbf3319b48c" Apr 17 17:31:06.097116 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:06.097100 2546 scope.go:117] "RemoveContainer" containerID="ede5d48c65871dc80054a8d0cb88e91b70933cd82259cb18403c8dbf3319b48c" Apr 17 17:31:06.097372 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:31:06.097357 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ede5d48c65871dc80054a8d0cb88e91b70933cd82259cb18403c8dbf3319b48c\": container with ID starting with ede5d48c65871dc80054a8d0cb88e91b70933cd82259cb18403c8dbf3319b48c not found: ID does not exist" containerID="ede5d48c65871dc80054a8d0cb88e91b70933cd82259cb18403c8dbf3319b48c" Apr 17 17:31:06.097421 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:06.097380 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ede5d48c65871dc80054a8d0cb88e91b70933cd82259cb18403c8dbf3319b48c"} err="failed to get container status \"ede5d48c65871dc80054a8d0cb88e91b70933cd82259cb18403c8dbf3319b48c\": rpc error: code = NotFound desc = could not find container \"ede5d48c65871dc80054a8d0cb88e91b70933cd82259cb18403c8dbf3319b48c\": container with ID starting with ede5d48c65871dc80054a8d0cb88e91b70933cd82259cb18403c8dbf3319b48c not found: ID does not exist" Apr 17 17:31:06.111820 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:06.111793 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86c9487b9d-ssbgw"] Apr 17 17:31:06.114584 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:06.114564 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-86c9487b9d-ssbgw"] Apr 17 17:31:07.102166 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.102138 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rrr95"] Apr 17 17:31:07.102548 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.102442 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fddc433-31ea-46ab-a059-c2afe3a0bb9c" containerName="console" Apr 17 17:31:07.102548 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.102453 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fddc433-31ea-46ab-a059-c2afe3a0bb9c" containerName="console" Apr 17 17:31:07.102548 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.102511 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fddc433-31ea-46ab-a059-c2afe3a0bb9c" containerName="console" Apr 17 17:31:07.106499 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.106483 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rrr95" Apr 17 17:31:07.111483 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.111467 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 17:31:07.122051 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.122027 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rrr95"] Apr 17 17:31:07.240043 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.240009 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b7bb99d1-76a8-40ac-9a30-1ebde78e79f3-original-pull-secret\") pod \"global-pull-secret-syncer-rrr95\" (UID: \"b7bb99d1-76a8-40ac-9a30-1ebde78e79f3\") " pod="kube-system/global-pull-secret-syncer-rrr95" Apr 17 17:31:07.240253 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.240057 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b7bb99d1-76a8-40ac-9a30-1ebde78e79f3-dbus\") pod \"global-pull-secret-syncer-rrr95\" (UID: \"b7bb99d1-76a8-40ac-9a30-1ebde78e79f3\") " pod="kube-system/global-pull-secret-syncer-rrr95" Apr 17 17:31:07.240253 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.240171 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b7bb99d1-76a8-40ac-9a30-1ebde78e79f3-kubelet-config\") pod \"global-pull-secret-syncer-rrr95\" (UID: \"b7bb99d1-76a8-40ac-9a30-1ebde78e79f3\") " pod="kube-system/global-pull-secret-syncer-rrr95" Apr 17 17:31:07.340981 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.340948 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/b7bb99d1-76a8-40ac-9a30-1ebde78e79f3-dbus\") pod \"global-pull-secret-syncer-rrr95\" (UID: \"b7bb99d1-76a8-40ac-9a30-1ebde78e79f3\") " pod="kube-system/global-pull-secret-syncer-rrr95" Apr 17 17:31:07.341136 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.341012 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b7bb99d1-76a8-40ac-9a30-1ebde78e79f3-kubelet-config\") pod \"global-pull-secret-syncer-rrr95\" (UID: \"b7bb99d1-76a8-40ac-9a30-1ebde78e79f3\") " pod="kube-system/global-pull-secret-syncer-rrr95" Apr 17 17:31:07.341136 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.341051 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b7bb99d1-76a8-40ac-9a30-1ebde78e79f3-original-pull-secret\") pod \"global-pull-secret-syncer-rrr95\" (UID: \"b7bb99d1-76a8-40ac-9a30-1ebde78e79f3\") " pod="kube-system/global-pull-secret-syncer-rrr95" Apr 17 17:31:07.341233 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.341153 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b7bb99d1-76a8-40ac-9a30-1ebde78e79f3-kubelet-config\") pod \"global-pull-secret-syncer-rrr95\" (UID: \"b7bb99d1-76a8-40ac-9a30-1ebde78e79f3\") " pod="kube-system/global-pull-secret-syncer-rrr95" Apr 17 17:31:07.341233 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.341167 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b7bb99d1-76a8-40ac-9a30-1ebde78e79f3-dbus\") pod \"global-pull-secret-syncer-rrr95\" (UID: \"b7bb99d1-76a8-40ac-9a30-1ebde78e79f3\") " pod="kube-system/global-pull-secret-syncer-rrr95" Apr 17 17:31:07.343287 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.343271 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b7bb99d1-76a8-40ac-9a30-1ebde78e79f3-original-pull-secret\") pod \"global-pull-secret-syncer-rrr95\" (UID: \"b7bb99d1-76a8-40ac-9a30-1ebde78e79f3\") " pod="kube-system/global-pull-secret-syncer-rrr95" Apr 17 17:31:07.415504 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.415429 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rrr95" Apr 17 17:31:07.534721 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.534671 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rrr95"] Apr 17 17:31:07.537843 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:31:07.537809 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7bb99d1_76a8_40ac_9a30_1ebde78e79f3.slice/crio-39fd54c3e97775dd73885eecf08f1417b272e8ba348aef74fead52b57d2aac51 WatchSource:0}: Error finding container 39fd54c3e97775dd73885eecf08f1417b272e8ba348aef74fead52b57d2aac51: Status 404 returned error can't find the container with id 39fd54c3e97775dd73885eecf08f1417b272e8ba348aef74fead52b57d2aac51 Apr 17 17:31:07.953556 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:07.953528 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fddc433-31ea-46ab-a059-c2afe3a0bb9c" path="/var/lib/kubelet/pods/6fddc433-31ea-46ab-a059-c2afe3a0bb9c/volumes" Apr 17 17:31:08.096611 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:08.096525 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rrr95" event={"ID":"b7bb99d1-76a8-40ac-9a30-1ebde78e79f3","Type":"ContainerStarted","Data":"39fd54c3e97775dd73885eecf08f1417b272e8ba348aef74fead52b57d2aac51"} Apr 17 17:31:12.109754 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:12.109713 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/global-pull-secret-syncer-rrr95" event={"ID":"b7bb99d1-76a8-40ac-9a30-1ebde78e79f3","Type":"ContainerStarted","Data":"4ed0ed3d9da71cd0b63c8278b43103c80fd6b918cde81a30c97270be6b48f255"} Apr 17 17:31:12.126392 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:12.126344 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rrr95" podStartSLOduration=1.089943348 podStartE2EDuration="5.126331021s" podCreationTimestamp="2026-04-17 17:31:07 +0000 UTC" firstStartedPulling="2026-04-17 17:31:07.539388914 +0000 UTC m=+360.161669567" lastFinishedPulling="2026-04-17 17:31:11.575776584 +0000 UTC m=+364.198057240" observedRunningTime="2026-04-17 17:31:12.125527425 +0000 UTC m=+364.747808101" watchObservedRunningTime="2026-04-17 17:31:12.126331021 +0000 UTC m=+364.748611697" Apr 17 17:31:55.159132 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:55.159096 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9"] Apr 17 17:31:55.161395 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:55.161380 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" Apr 17 17:31:55.164118 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:55.164100 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 17:31:55.164118 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:55.164104 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 17:31:55.164993 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:55.164978 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-9w26p\"" Apr 17 17:31:55.171538 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:55.171514 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9"] Apr 17 17:31:55.239492 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:55.239455 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c99a28c-8d3f-4207-8b1e-bc39f80025da-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9\" (UID: \"1c99a28c-8d3f-4207-8b1e-bc39f80025da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" Apr 17 17:31:55.239655 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:55.239508 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c99a28c-8d3f-4207-8b1e-bc39f80025da-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9\" (UID: \"1c99a28c-8d3f-4207-8b1e-bc39f80025da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" Apr 17 17:31:55.239655 
ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:55.239553 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp75j\" (UniqueName: \"kubernetes.io/projected/1c99a28c-8d3f-4207-8b1e-bc39f80025da-kube-api-access-pp75j\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9\" (UID: \"1c99a28c-8d3f-4207-8b1e-bc39f80025da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" Apr 17 17:31:55.340950 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:55.340915 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c99a28c-8d3f-4207-8b1e-bc39f80025da-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9\" (UID: \"1c99a28c-8d3f-4207-8b1e-bc39f80025da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" Apr 17 17:31:55.340950 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:55.340954 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c99a28c-8d3f-4207-8b1e-bc39f80025da-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9\" (UID: \"1c99a28c-8d3f-4207-8b1e-bc39f80025da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" Apr 17 17:31:55.341147 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:55.340986 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pp75j\" (UniqueName: \"kubernetes.io/projected/1c99a28c-8d3f-4207-8b1e-bc39f80025da-kube-api-access-pp75j\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9\" (UID: \"1c99a28c-8d3f-4207-8b1e-bc39f80025da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" Apr 17 17:31:55.341327 ip-10-0-137-46 
kubenswrapper[2546]: I0417 17:31:55.341310 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c99a28c-8d3f-4207-8b1e-bc39f80025da-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9\" (UID: \"1c99a28c-8d3f-4207-8b1e-bc39f80025da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" Apr 17 17:31:55.341363 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:55.341339 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c99a28c-8d3f-4207-8b1e-bc39f80025da-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9\" (UID: \"1c99a28c-8d3f-4207-8b1e-bc39f80025da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" Apr 17 17:31:55.349339 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:55.349310 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp75j\" (UniqueName: \"kubernetes.io/projected/1c99a28c-8d3f-4207-8b1e-bc39f80025da-kube-api-access-pp75j\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9\" (UID: \"1c99a28c-8d3f-4207-8b1e-bc39f80025da\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" Apr 17 17:31:55.470828 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:55.470747 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" Apr 17 17:31:55.588598 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:55.588426 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9"] Apr 17 17:31:55.591222 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:31:55.591193 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c99a28c_8d3f_4207_8b1e_bc39f80025da.slice/crio-79cc46923521f106888f1387ec035d6f0fb1977ea6124aeac94670ef832f6342 WatchSource:0}: Error finding container 79cc46923521f106888f1387ec035d6f0fb1977ea6124aeac94670ef832f6342: Status 404 returned error can't find the container with id 79cc46923521f106888f1387ec035d6f0fb1977ea6124aeac94670ef832f6342 Apr 17 17:31:56.240989 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:31:56.240950 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" event={"ID":"1c99a28c-8d3f-4207-8b1e-bc39f80025da","Type":"ContainerStarted","Data":"79cc46923521f106888f1387ec035d6f0fb1977ea6124aeac94670ef832f6342"} Apr 17 17:32:01.257147 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:01.257058 2546 generic.go:358] "Generic (PLEG): container finished" podID="1c99a28c-8d3f-4207-8b1e-bc39f80025da" containerID="71714001e4e1a74d07a613ed245801a975015311f7f4b61d99765c6595f530fb" exitCode=0 Apr 17 17:32:01.257492 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:01.257150 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" event={"ID":"1c99a28c-8d3f-4207-8b1e-bc39f80025da","Type":"ContainerDied","Data":"71714001e4e1a74d07a613ed245801a975015311f7f4b61d99765c6595f530fb"} Apr 17 17:32:04.267759 ip-10-0-137-46 kubenswrapper[2546]: I0417 
17:32:04.267726 2546 generic.go:358] "Generic (PLEG): container finished" podID="1c99a28c-8d3f-4207-8b1e-bc39f80025da" containerID="a56f1d74f4a351d83060335b80b60ac2c0340e91acef85ae10acf3a58072eb9b" exitCode=0 Apr 17 17:32:04.268124 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:04.267768 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" event={"ID":"1c99a28c-8d3f-4207-8b1e-bc39f80025da","Type":"ContainerDied","Data":"a56f1d74f4a351d83060335b80b60ac2c0340e91acef85ae10acf3a58072eb9b"} Apr 17 17:32:13.298596 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:13.298558 2546 generic.go:358] "Generic (PLEG): container finished" podID="1c99a28c-8d3f-4207-8b1e-bc39f80025da" containerID="ad3ebffb56044a924150e27379c2e0062ae6f0d62aaff92193ecdba7f8284a25" exitCode=0 Apr 17 17:32:13.298988 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:13.298639 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" event={"ID":"1c99a28c-8d3f-4207-8b1e-bc39f80025da","Type":"ContainerDied","Data":"ad3ebffb56044a924150e27379c2e0062ae6f0d62aaff92193ecdba7f8284a25"} Apr 17 17:32:14.420632 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:14.420610 2546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" Apr 17 17:32:14.525576 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:14.525536 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c99a28c-8d3f-4207-8b1e-bc39f80025da-bundle\") pod \"1c99a28c-8d3f-4207-8b1e-bc39f80025da\" (UID: \"1c99a28c-8d3f-4207-8b1e-bc39f80025da\") " Apr 17 17:32:14.525576 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:14.525584 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp75j\" (UniqueName: \"kubernetes.io/projected/1c99a28c-8d3f-4207-8b1e-bc39f80025da-kube-api-access-pp75j\") pod \"1c99a28c-8d3f-4207-8b1e-bc39f80025da\" (UID: \"1c99a28c-8d3f-4207-8b1e-bc39f80025da\") " Apr 17 17:32:14.525846 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:14.525664 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c99a28c-8d3f-4207-8b1e-bc39f80025da-util\") pod \"1c99a28c-8d3f-4207-8b1e-bc39f80025da\" (UID: \"1c99a28c-8d3f-4207-8b1e-bc39f80025da\") " Apr 17 17:32:14.526146 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:14.526119 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c99a28c-8d3f-4207-8b1e-bc39f80025da-bundle" (OuterVolumeSpecName: "bundle") pod "1c99a28c-8d3f-4207-8b1e-bc39f80025da" (UID: "1c99a28c-8d3f-4207-8b1e-bc39f80025da"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:32:14.528052 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:14.528026 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c99a28c-8d3f-4207-8b1e-bc39f80025da-kube-api-access-pp75j" (OuterVolumeSpecName: "kube-api-access-pp75j") pod "1c99a28c-8d3f-4207-8b1e-bc39f80025da" (UID: "1c99a28c-8d3f-4207-8b1e-bc39f80025da"). InnerVolumeSpecName "kube-api-access-pp75j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:32:14.529610 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:14.529591 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c99a28c-8d3f-4207-8b1e-bc39f80025da-util" (OuterVolumeSpecName: "util") pod "1c99a28c-8d3f-4207-8b1e-bc39f80025da" (UID: "1c99a28c-8d3f-4207-8b1e-bc39f80025da"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:32:14.627253 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:14.627169 2546 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c99a28c-8d3f-4207-8b1e-bc39f80025da-bundle\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:32:14.627253 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:14.627200 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pp75j\" (UniqueName: \"kubernetes.io/projected/1c99a28c-8d3f-4207-8b1e-bc39f80025da-kube-api-access-pp75j\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:32:14.627253 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:14.627209 2546 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c99a28c-8d3f-4207-8b1e-bc39f80025da-util\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:32:15.308586 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:15.308554 2546 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9" event={"ID":"1c99a28c-8d3f-4207-8b1e-bc39f80025da","Type":"ContainerDied","Data":"79cc46923521f106888f1387ec035d6f0fb1977ea6124aeac94670ef832f6342"}
Apr 17 17:32:15.308586 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:15.308589 2546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79cc46923521f106888f1387ec035d6f0fb1977ea6124aeac94670ef832f6342"
Apr 17 17:32:15.308899 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:15.308566 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg4xd9"
Apr 17 17:32:17.477360 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.477326 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q"]
Apr 17 17:32:17.477952 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.477640 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c99a28c-8d3f-4207-8b1e-bc39f80025da" containerName="pull"
Apr 17 17:32:17.477952 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.477652 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c99a28c-8d3f-4207-8b1e-bc39f80025da" containerName="pull"
Apr 17 17:32:17.477952 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.477663 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c99a28c-8d3f-4207-8b1e-bc39f80025da" containerName="util"
Apr 17 17:32:17.477952 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.477668 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c99a28c-8d3f-4207-8b1e-bc39f80025da" containerName="util"
Apr 17 17:32:17.477952 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.477702 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c99a28c-8d3f-4207-8b1e-bc39f80025da" containerName="extract"
Apr 17 17:32:17.477952 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.477711 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c99a28c-8d3f-4207-8b1e-bc39f80025da" containerName="extract"
Apr 17 17:32:17.477952 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.477774 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c99a28c-8d3f-4207-8b1e-bc39f80025da" containerName="extract"
Apr 17 17:32:17.495594 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.495568 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q"
Apr 17 17:32:17.514635 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.514609 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 17 17:32:17.514762 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.514697 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-bgkpm\""
Apr 17 17:32:17.515004 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.514989 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 17 17:32:17.515344 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.515321 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 17 17:32:17.518127 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.518104 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q"]
Apr 17 17:32:17.651132 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.651091 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/be1f72e7-4061-4971-813c-10b67ec366de-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q\" (UID: \"be1f72e7-4061-4971-813c-10b67ec366de\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q"
Apr 17 17:32:17.651296 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.651205 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kxtw\" (UniqueName: \"kubernetes.io/projected/be1f72e7-4061-4971-813c-10b67ec366de-kube-api-access-5kxtw\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q\" (UID: \"be1f72e7-4061-4971-813c-10b67ec366de\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q"
Apr 17 17:32:17.752411 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.752321 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/be1f72e7-4061-4971-813c-10b67ec366de-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q\" (UID: \"be1f72e7-4061-4971-813c-10b67ec366de\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q"
Apr 17 17:32:17.752411 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.752383 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kxtw\" (UniqueName: \"kubernetes.io/projected/be1f72e7-4061-4971-813c-10b67ec366de-kube-api-access-5kxtw\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q\" (UID: \"be1f72e7-4061-4971-813c-10b67ec366de\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q"
Apr 17 17:32:17.754803 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.754768 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/be1f72e7-4061-4971-813c-10b67ec366de-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q\" (UID: \"be1f72e7-4061-4971-813c-10b67ec366de\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q"
Apr 17 17:32:17.778073 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.778050 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kxtw\" (UniqueName: \"kubernetes.io/projected/be1f72e7-4061-4971-813c-10b67ec366de-kube-api-access-5kxtw\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q\" (UID: \"be1f72e7-4061-4971-813c-10b67ec366de\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q"
Apr 17 17:32:17.805591 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.805568 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q"
Apr 17 17:32:17.944264 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:17.944236 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q"]
Apr 17 17:32:17.947288 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:32:17.947259 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe1f72e7_4061_4971_813c_10b67ec366de.slice/crio-a87958ea41644d4be90a2c65571bc6885a591235d6dd5d84c4ed1923351ea97e WatchSource:0}: Error finding container a87958ea41644d4be90a2c65571bc6885a591235d6dd5d84c4ed1923351ea97e: Status 404 returned error can't find the container with id a87958ea41644d4be90a2c65571bc6885a591235d6dd5d84c4ed1923351ea97e
Apr 17 17:32:18.317874 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:18.317838 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q" event={"ID":"be1f72e7-4061-4971-813c-10b67ec366de","Type":"ContainerStarted","Data":"a87958ea41644d4be90a2c65571bc6885a591235d6dd5d84c4ed1923351ea97e"}
Apr 17 17:32:22.333979 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.333932 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q" event={"ID":"be1f72e7-4061-4971-813c-10b67ec366de","Type":"ContainerStarted","Data":"3691880ee247c5d83a8fe06cdf5c1bb1244072e150bd28401bfe522230323188"}
Apr 17 17:32:22.334364 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.334098 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q"
Apr 17 17:32:22.769126 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.769065 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q" podStartSLOduration=1.561661644 podStartE2EDuration="5.769045987s" podCreationTimestamp="2026-04-17 17:32:17 +0000 UTC" firstStartedPulling="2026-04-17 17:32:17.94910954 +0000 UTC m=+430.571390193" lastFinishedPulling="2026-04-17 17:32:22.156493868 +0000 UTC m=+434.778774536" observedRunningTime="2026-04-17 17:32:22.376666971 +0000 UTC m=+434.998947645" watchObservedRunningTime="2026-04-17 17:32:22.769045987 +0000 UTC m=+435.391326663"
Apr 17 17:32:22.770516 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.770497 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5mjmc"]
Apr 17 17:32:22.773856 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.773837 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:32:22.776202 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.776182 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 17 17:32:22.776325 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.776308 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 17 17:32:22.776369 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.776308 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-p7mt4\""
Apr 17 17:32:22.782136 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.782114 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5mjmc"]
Apr 17 17:32:22.794314 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.794291 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnxsl\" (UniqueName: \"kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-kube-api-access-mnxsl\") pod \"keda-operator-ffbb595cb-5mjmc\" (UID: \"eeec8a83-8c97-4310-b447-44e6ee88065a\") " pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:32:22.794456 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.794327 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-certificates\") pod \"keda-operator-ffbb595cb-5mjmc\" (UID: \"eeec8a83-8c97-4310-b447-44e6ee88065a\") " pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:32:22.794456 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.794367 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/eeec8a83-8c97-4310-b447-44e6ee88065a-cabundle0\") pod \"keda-operator-ffbb595cb-5mjmc\" (UID: \"eeec8a83-8c97-4310-b447-44e6ee88065a\") " pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:32:22.894958 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.894926 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-certificates\") pod \"keda-operator-ffbb595cb-5mjmc\" (UID: \"eeec8a83-8c97-4310-b447-44e6ee88065a\") " pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:32:22.894958 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.894971 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/eeec8a83-8c97-4310-b447-44e6ee88065a-cabundle0\") pod \"keda-operator-ffbb595cb-5mjmc\" (UID: \"eeec8a83-8c97-4310-b447-44e6ee88065a\") " pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:32:22.895213 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.895037 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxsl\" (UniqueName: \"kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-kube-api-access-mnxsl\") pod \"keda-operator-ffbb595cb-5mjmc\" (UID: \"eeec8a83-8c97-4310-b447-44e6ee88065a\") " pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:32:22.895213 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:22.895080 2546 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 17 17:32:22.895213 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:22.895102 2546 secret.go:281] references non-existent secret key: ca.crt
Apr 17 17:32:22.895213 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:22.895110 2546 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 17 17:32:22.895213 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:22.895123 2546 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5mjmc: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 17 17:32:22.895213 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:22.895173 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-certificates podName:eeec8a83-8c97-4310-b447-44e6ee88065a nodeName:}" failed. No retries permitted until 2026-04-17 17:32:23.395158165 +0000 UTC m=+436.017438818 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-certificates") pod "keda-operator-ffbb595cb-5mjmc" (UID: "eeec8a83-8c97-4310-b447-44e6ee88065a") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 17 17:32:22.895760 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.895738 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/eeec8a83-8c97-4310-b447-44e6ee88065a-cabundle0\") pod \"keda-operator-ffbb595cb-5mjmc\" (UID: \"eeec8a83-8c97-4310-b447-44e6ee88065a\") " pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:32:22.909566 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:22.909536 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnxsl\" (UniqueName: \"kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-kube-api-access-mnxsl\") pod \"keda-operator-ffbb595cb-5mjmc\" (UID: \"eeec8a83-8c97-4310-b447-44e6ee88065a\") " pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:32:23.398744 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:23.398706 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-certificates\") pod \"keda-operator-ffbb595cb-5mjmc\" (UID: \"eeec8a83-8c97-4310-b447-44e6ee88065a\") " pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:32:23.399237 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:23.398791 2546 secret.go:281] references non-existent secret key: ca.crt
Apr 17 17:32:23.399237 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:23.398808 2546 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 17 17:32:23.399237 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:23.398819 2546 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5mjmc: references non-existent secret key: ca.crt
Apr 17 17:32:23.399237 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:23.398873 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-certificates podName:eeec8a83-8c97-4310-b447-44e6ee88065a nodeName:}" failed. No retries permitted until 2026-04-17 17:32:24.398852349 +0000 UTC m=+437.021133002 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-certificates") pod "keda-operator-ffbb595cb-5mjmc" (UID: "eeec8a83-8c97-4310-b447-44e6ee88065a") : references non-existent secret key: ca.crt
Apr 17 17:32:24.409311 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:24.409270 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-certificates\") pod \"keda-operator-ffbb595cb-5mjmc\" (UID: \"eeec8a83-8c97-4310-b447-44e6ee88065a\") " pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:32:24.409738 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:24.409390 2546 secret.go:281] references non-existent secret key: ca.crt
Apr 17 17:32:24.409738 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:24.409402 2546 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 17 17:32:24.409738 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:24.409410 2546 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5mjmc: references non-existent secret key: ca.crt
Apr 17 17:32:24.409738 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:24.409454 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-certificates podName:eeec8a83-8c97-4310-b447-44e6ee88065a nodeName:}" failed. No retries permitted until 2026-04-17 17:32:26.409441328 +0000 UTC m=+439.031721981 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-certificates") pod "keda-operator-ffbb595cb-5mjmc" (UID: "eeec8a83-8c97-4310-b447-44e6ee88065a") : references non-existent secret key: ca.crt
Apr 17 17:32:26.426622 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:26.426579 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-certificates\") pod \"keda-operator-ffbb595cb-5mjmc\" (UID: \"eeec8a83-8c97-4310-b447-44e6ee88065a\") " pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:32:26.427122 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:26.426777 2546 secret.go:281] references non-existent secret key: ca.crt
Apr 17 17:32:26.427122 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:26.426804 2546 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 17 17:32:26.427122 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:26.426817 2546 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5mjmc: references non-existent secret key: ca.crt
Apr 17 17:32:26.427122 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:32:26.426886 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-certificates podName:eeec8a83-8c97-4310-b447-44e6ee88065a nodeName:}" failed. No retries permitted until 2026-04-17 17:32:30.426865839 +0000 UTC m=+443.049146495 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-certificates") pod "keda-operator-ffbb595cb-5mjmc" (UID: "eeec8a83-8c97-4310-b447-44e6ee88065a") : references non-existent secret key: ca.crt
Apr 17 17:32:30.460364 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:30.460310 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-certificates\") pod \"keda-operator-ffbb595cb-5mjmc\" (UID: \"eeec8a83-8c97-4310-b447-44e6ee88065a\") " pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:32:30.462766 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:30.462746 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/eeec8a83-8c97-4310-b447-44e6ee88065a-certificates\") pod \"keda-operator-ffbb595cb-5mjmc\" (UID: \"eeec8a83-8c97-4310-b447-44e6ee88065a\") " pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:32:30.584903 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:30.584870 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:32:30.702826 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:30.702800 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5mjmc"]
Apr 17 17:32:30.705360 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:32:30.705335 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeec8a83_8c97_4310_b447_44e6ee88065a.slice/crio-989dbb78e46865c4179048b9bd8dcddc9500ac659344b4addf6e2f346c3316d7 WatchSource:0}: Error finding container 989dbb78e46865c4179048b9bd8dcddc9500ac659344b4addf6e2f346c3316d7: Status 404 returned error can't find the container with id 989dbb78e46865c4179048b9bd8dcddc9500ac659344b4addf6e2f346c3316d7
Apr 17 17:32:31.362846 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:31.362811 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-5mjmc" event={"ID":"eeec8a83-8c97-4310-b447-44e6ee88065a","Type":"ContainerStarted","Data":"989dbb78e46865c4179048b9bd8dcddc9500ac659344b4addf6e2f346c3316d7"}
Apr 17 17:32:34.374243 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:34.374207 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-5mjmc" event={"ID":"eeec8a83-8c97-4310-b447-44e6ee88065a","Type":"ContainerStarted","Data":"5ad8ca35584610dff9a0daae5607b98b85954809eb649b12275ed31271643ca6"}
Apr 17 17:32:34.374647 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:34.374288 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:32:34.391262 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:34.391213 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-5mjmc" podStartSLOduration=9.084828953 podStartE2EDuration="12.391201999s" podCreationTimestamp="2026-04-17 17:32:22 +0000 UTC" firstStartedPulling="2026-04-17 17:32:30.706566303 +0000 UTC m=+443.328846957" lastFinishedPulling="2026-04-17 17:32:34.012939351 +0000 UTC m=+446.635220003" observedRunningTime="2026-04-17 17:32:34.38912286 +0000 UTC m=+447.011403534" watchObservedRunningTime="2026-04-17 17:32:34.391201999 +0000 UTC m=+447.013482674"
Apr 17 17:32:43.339247 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:43.339216 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jnh4q"
Apr 17 17:32:55.379586 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:32:55.379555 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-5mjmc"
Apr 17 17:33:29.172755 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.172690 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"]
Apr 17 17:33:29.175818 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.175799 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"
Apr 17 17:33:29.178036 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.178007 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 17 17:33:29.178155 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.178085 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 17 17:33:29.178155 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.178145 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-x2wbd\""
Apr 17 17:33:29.178813 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.178799 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 17 17:33:29.184504 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.184482 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-s5cxq"]
Apr 17 17:33:29.187857 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.187840 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"]
Apr 17 17:33:29.187960 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.187949 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-s5cxq"
Apr 17 17:33:29.190140 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.190115 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 17 17:33:29.190238 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.190157 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-8vzkp\""
Apr 17 17:33:29.196741 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.196719 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-s5cxq"]
Apr 17 17:33:29.247493 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.247465 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3cff457-b524-4a38-849a-22f35dec88e0-cert\") pod \"kserve-controller-manager-85dd7cfb4d-rlbkf\" (UID: \"b3cff457-b524-4a38-849a-22f35dec88e0\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"
Apr 17 17:33:29.247493 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.247495 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbbq8\" (UniqueName: \"kubernetes.io/projected/b3cff457-b524-4a38-849a-22f35dec88e0-kube-api-access-bbbq8\") pod \"kserve-controller-manager-85dd7cfb4d-rlbkf\" (UID: \"b3cff457-b524-4a38-849a-22f35dec88e0\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"
Apr 17 17:33:29.247723 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.247518 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d8ad29e-c7bc-476f-a723-5c96ac217a68-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-s5cxq\" (UID: \"0d8ad29e-c7bc-476f-a723-5c96ac217a68\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-s5cxq"
Apr 17 17:33:29.247723 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.247619 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jjm8\" (UniqueName: \"kubernetes.io/projected/0d8ad29e-c7bc-476f-a723-5c96ac217a68-kube-api-access-4jjm8\") pod \"llmisvc-controller-manager-68cc5db7c4-s5cxq\" (UID: \"0d8ad29e-c7bc-476f-a723-5c96ac217a68\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-s5cxq"
Apr 17 17:33:29.348200 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.348166 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jjm8\" (UniqueName: \"kubernetes.io/projected/0d8ad29e-c7bc-476f-a723-5c96ac217a68-kube-api-access-4jjm8\") pod \"llmisvc-controller-manager-68cc5db7c4-s5cxq\" (UID: \"0d8ad29e-c7bc-476f-a723-5c96ac217a68\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-s5cxq"
Apr 17 17:33:29.348371 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.348244 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3cff457-b524-4a38-849a-22f35dec88e0-cert\") pod \"kserve-controller-manager-85dd7cfb4d-rlbkf\" (UID: \"b3cff457-b524-4a38-849a-22f35dec88e0\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"
Apr 17 17:33:29.348371 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.348262 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbbq8\" (UniqueName: \"kubernetes.io/projected/b3cff457-b524-4a38-849a-22f35dec88e0-kube-api-access-bbbq8\") pod \"kserve-controller-manager-85dd7cfb4d-rlbkf\" (UID: \"b3cff457-b524-4a38-849a-22f35dec88e0\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"
Apr 17 17:33:29.348371 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.348278 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d8ad29e-c7bc-476f-a723-5c96ac217a68-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-s5cxq\" (UID: \"0d8ad29e-c7bc-476f-a723-5c96ac217a68\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-s5cxq"
Apr 17 17:33:29.348533 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:33:29.348377 2546 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 17 17:33:29.348533 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:33:29.348447 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3cff457-b524-4a38-849a-22f35dec88e0-cert podName:b3cff457-b524-4a38-849a-22f35dec88e0 nodeName:}" failed. No retries permitted until 2026-04-17 17:33:29.84842848 +0000 UTC m=+502.470709142 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b3cff457-b524-4a38-849a-22f35dec88e0-cert") pod "kserve-controller-manager-85dd7cfb4d-rlbkf" (UID: "b3cff457-b524-4a38-849a-22f35dec88e0") : secret "kserve-webhook-server-cert" not found
Apr 17 17:33:29.350702 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.350649 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d8ad29e-c7bc-476f-a723-5c96ac217a68-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-s5cxq\" (UID: \"0d8ad29e-c7bc-476f-a723-5c96ac217a68\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-s5cxq"
Apr 17 17:33:29.363208 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.363180 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jjm8\" (UniqueName: \"kubernetes.io/projected/0d8ad29e-c7bc-476f-a723-5c96ac217a68-kube-api-access-4jjm8\") pod \"llmisvc-controller-manager-68cc5db7c4-s5cxq\" (UID: \"0d8ad29e-c7bc-476f-a723-5c96ac217a68\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-s5cxq"
Apr 17 17:33:29.367367 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.367343 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbbq8\" (UniqueName: \"kubernetes.io/projected/b3cff457-b524-4a38-849a-22f35dec88e0-kube-api-access-bbbq8\") pod \"kserve-controller-manager-85dd7cfb4d-rlbkf\" (UID: \"b3cff457-b524-4a38-849a-22f35dec88e0\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"
Apr 17 17:33:29.498903 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.498811 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-s5cxq"
Apr 17 17:33:29.622926 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.622893 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-s5cxq"]
Apr 17 17:33:29.626465 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:33:29.626426 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0d8ad29e_c7bc_476f_a723_5c96ac217a68.slice/crio-a9a17b49a229eae0a8e0b8cfa3a502f9a35166eadf02815cf12d1e08a8d5c997 WatchSource:0}: Error finding container a9a17b49a229eae0a8e0b8cfa3a502f9a35166eadf02815cf12d1e08a8d5c997: Status 404 returned error can't find the container with id a9a17b49a229eae0a8e0b8cfa3a502f9a35166eadf02815cf12d1e08a8d5c997
Apr 17 17:33:29.853342 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.853296 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3cff457-b524-4a38-849a-22f35dec88e0-cert\") pod \"kserve-controller-manager-85dd7cfb4d-rlbkf\" (UID: \"b3cff457-b524-4a38-849a-22f35dec88e0\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"
Apr 17 17:33:29.855552 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:29.855524 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3cff457-b524-4a38-849a-22f35dec88e0-cert\") pod \"kserve-controller-manager-85dd7cfb4d-rlbkf\" (UID: \"b3cff457-b524-4a38-849a-22f35dec88e0\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"
Apr 17 17:33:30.087138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:30.087095 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"
Apr 17 17:33:30.225416 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:30.225384 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"]
Apr 17 17:33:30.229046 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:33:30.229015 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3cff457_b524_4a38_849a_22f35dec88e0.slice/crio-95f12e47945961d8ace5a196e87150e71a76a8c868e64a7053c02795ea88db26 WatchSource:0}: Error finding container 95f12e47945961d8ace5a196e87150e71a76a8c868e64a7053c02795ea88db26: Status 404 returned error can't find the container with id 95f12e47945961d8ace5a196e87150e71a76a8c868e64a7053c02795ea88db26
Apr 17 17:33:30.554706 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:30.554650 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf" event={"ID":"b3cff457-b524-4a38-849a-22f35dec88e0","Type":"ContainerStarted","Data":"95f12e47945961d8ace5a196e87150e71a76a8c868e64a7053c02795ea88db26"}
Apr 17 17:33:30.555955 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:30.555917 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-s5cxq" event={"ID":"0d8ad29e-c7bc-476f-a723-5c96ac217a68","Type":"ContainerStarted","Data":"a9a17b49a229eae0a8e0b8cfa3a502f9a35166eadf02815cf12d1e08a8d5c997"}
Apr 17 17:33:33.569430 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:33.569393 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-s5cxq" event={"ID":"0d8ad29e-c7bc-476f-a723-5c96ac217a68","Type":"ContainerStarted","Data":"edfbc8d492fd11b84c2c2b9733f05d13d410d687563ee602a2f4e9169653faa9"}
Apr 17 17:33:33.569920 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:33.569470 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-s5cxq"
Apr 17 17:33:33.570769 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:33.570747 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf" event={"ID":"b3cff457-b524-4a38-849a-22f35dec88e0","Type":"ContainerStarted","Data":"982f5b6bba733944134d316926529ed2a99c8cb7791950968328bc4cc8afc1bb"}
Apr 17 17:33:33.570901 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:33.570859 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"
Apr 17 17:33:33.585774 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:33.585731 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-s5cxq" podStartSLOduration=1.215398516 podStartE2EDuration="4.585716255s" podCreationTimestamp="2026-04-17 17:33:29 +0000 UTC" firstStartedPulling="2026-04-17 17:33:29.628169343 +0000 UTC m=+502.250449996" lastFinishedPulling="2026-04-17 17:33:32.998487076 +0000 UTC m=+505.620767735" observedRunningTime="2026-04-17 17:33:33.584574726 +0000 UTC m=+506.206855401" watchObservedRunningTime="2026-04-17 17:33:33.585716255 +0000 UTC m=+506.207996979"
Apr 17 17:33:33.601565 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:33:33.601517 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf" podStartSLOduration=1.786930483 podStartE2EDuration="4.601503277s" podCreationTimestamp="2026-04-17 17:33:29 +0000 UTC" firstStartedPulling="2026-04-17 17:33:30.230674016 +0000 UTC m=+502.852954673" lastFinishedPulling="2026-04-17 17:33:33.045246797 +0000 UTC m=+505.667527467" observedRunningTime="2026-04-17 17:33:33.599360146 +0000 UTC m=+506.221640833" watchObservedRunningTime="2026-04-17 17:33:33.601503277 +0000 UTC m=+506.223783929"
Apr 17 17:34:04.576776 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:04.576702 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-s5cxq"
Apr 17 17:34:04.579735 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:04.579712 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"
Apr 17 17:34:06.000171 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.000134 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"]
Apr 17 17:34:06.000551 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.000350 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf" podUID="b3cff457-b524-4a38-849a-22f35dec88e0" containerName="manager" containerID="cri-o://982f5b6bba733944134d316926529ed2a99c8cb7791950968328bc4cc8afc1bb" gracePeriod=10
Apr 17 17:34:06.025379 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.025344 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-bbwsx"]
Apr 17 17:34:06.027594 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.027580 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-bbwsx"
Apr 17 17:34:06.036609 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.036583 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-bbwsx"]
Apr 17 17:34:06.168242 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.168216 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dcae6a1-3266-4e8a-bc79-fcd36bffaa2c-cert\") pod \"kserve-controller-manager-85dd7cfb4d-bbwsx\" (UID: \"0dcae6a1-3266-4e8a-bc79-fcd36bffaa2c\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-bbwsx"
Apr 17 17:34:06.168360 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.168261 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq2fn\" (UniqueName: \"kubernetes.io/projected/0dcae6a1-3266-4e8a-bc79-fcd36bffaa2c-kube-api-access-zq2fn\") pod \"kserve-controller-manager-85dd7cfb4d-bbwsx\" (UID: \"0dcae6a1-3266-4e8a-bc79-fcd36bffaa2c\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-bbwsx"
Apr 17 17:34:06.237433 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.237411 2546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf" Apr 17 17:34:06.269536 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.269452 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dcae6a1-3266-4e8a-bc79-fcd36bffaa2c-cert\") pod \"kserve-controller-manager-85dd7cfb4d-bbwsx\" (UID: \"0dcae6a1-3266-4e8a-bc79-fcd36bffaa2c\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-bbwsx" Apr 17 17:34:06.269691 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.269535 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq2fn\" (UniqueName: \"kubernetes.io/projected/0dcae6a1-3266-4e8a-bc79-fcd36bffaa2c-kube-api-access-zq2fn\") pod \"kserve-controller-manager-85dd7cfb4d-bbwsx\" (UID: \"0dcae6a1-3266-4e8a-bc79-fcd36bffaa2c\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-bbwsx" Apr 17 17:34:06.272058 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.272027 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dcae6a1-3266-4e8a-bc79-fcd36bffaa2c-cert\") pod \"kserve-controller-manager-85dd7cfb4d-bbwsx\" (UID: \"0dcae6a1-3266-4e8a-bc79-fcd36bffaa2c\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-bbwsx" Apr 17 17:34:06.278357 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.278330 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq2fn\" (UniqueName: \"kubernetes.io/projected/0dcae6a1-3266-4e8a-bc79-fcd36bffaa2c-kube-api-access-zq2fn\") pod \"kserve-controller-manager-85dd7cfb4d-bbwsx\" (UID: \"0dcae6a1-3266-4e8a-bc79-fcd36bffaa2c\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-bbwsx" Apr 17 17:34:06.370532 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.370501 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/b3cff457-b524-4a38-849a-22f35dec88e0-cert\") pod \"b3cff457-b524-4a38-849a-22f35dec88e0\" (UID: \"b3cff457-b524-4a38-849a-22f35dec88e0\") " Apr 17 17:34:06.370752 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.370561 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbbq8\" (UniqueName: \"kubernetes.io/projected/b3cff457-b524-4a38-849a-22f35dec88e0-kube-api-access-bbbq8\") pod \"b3cff457-b524-4a38-849a-22f35dec88e0\" (UID: \"b3cff457-b524-4a38-849a-22f35dec88e0\") " Apr 17 17:34:06.372661 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.372635 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3cff457-b524-4a38-849a-22f35dec88e0-cert" (OuterVolumeSpecName: "cert") pod "b3cff457-b524-4a38-849a-22f35dec88e0" (UID: "b3cff457-b524-4a38-849a-22f35dec88e0"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:34:06.372752 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.372699 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3cff457-b524-4a38-849a-22f35dec88e0-kube-api-access-bbbq8" (OuterVolumeSpecName: "kube-api-access-bbbq8") pod "b3cff457-b524-4a38-849a-22f35dec88e0" (UID: "b3cff457-b524-4a38-849a-22f35dec88e0"). InnerVolumeSpecName "kube-api-access-bbbq8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:34:06.376779 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.376763 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-bbwsx" Apr 17 17:34:06.472289 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.472260 2546 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3cff457-b524-4a38-849a-22f35dec88e0-cert\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:34:06.472289 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.472289 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bbbq8\" (UniqueName: \"kubernetes.io/projected/b3cff457-b524-4a38-849a-22f35dec88e0-kube-api-access-bbbq8\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:34:06.491960 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.491908 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-bbwsx"] Apr 17 17:34:06.494145 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:34:06.494120 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dcae6a1_3266_4e8a_bc79_fcd36bffaa2c.slice/crio-728364e1e0c0ad5baecbd774418c192ea8453fdd76098e2ffdc3f7e407e8166a WatchSource:0}: Error finding container 728364e1e0c0ad5baecbd774418c192ea8453fdd76098e2ffdc3f7e407e8166a: Status 404 returned error can't find the container with id 728364e1e0c0ad5baecbd774418c192ea8453fdd76098e2ffdc3f7e407e8166a Apr 17 17:34:06.678245 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.678209 2546 generic.go:358] "Generic (PLEG): container finished" podID="b3cff457-b524-4a38-849a-22f35dec88e0" containerID="982f5b6bba733944134d316926529ed2a99c8cb7791950968328bc4cc8afc1bb" exitCode=0 Apr 17 17:34:06.678413 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.678277 2546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf" Apr 17 17:34:06.678413 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.678285 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf" event={"ID":"b3cff457-b524-4a38-849a-22f35dec88e0","Type":"ContainerDied","Data":"982f5b6bba733944134d316926529ed2a99c8cb7791950968328bc4cc8afc1bb"} Apr 17 17:34:06.678413 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.678321 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-rlbkf" event={"ID":"b3cff457-b524-4a38-849a-22f35dec88e0","Type":"ContainerDied","Data":"95f12e47945961d8ace5a196e87150e71a76a8c868e64a7053c02795ea88db26"} Apr 17 17:34:06.678413 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.678337 2546 scope.go:117] "RemoveContainer" containerID="982f5b6bba733944134d316926529ed2a99c8cb7791950968328bc4cc8afc1bb" Apr 17 17:34:06.679697 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.679649 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-bbwsx" event={"ID":"0dcae6a1-3266-4e8a-bc79-fcd36bffaa2c","Type":"ContainerStarted","Data":"728364e1e0c0ad5baecbd774418c192ea8453fdd76098e2ffdc3f7e407e8166a"} Apr 17 17:34:06.686437 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.686415 2546 scope.go:117] "RemoveContainer" containerID="982f5b6bba733944134d316926529ed2a99c8cb7791950968328bc4cc8afc1bb" Apr 17 17:34:06.686738 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:34:06.686713 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"982f5b6bba733944134d316926529ed2a99c8cb7791950968328bc4cc8afc1bb\": container with ID starting with 982f5b6bba733944134d316926529ed2a99c8cb7791950968328bc4cc8afc1bb not found: ID does not exist" containerID="982f5b6bba733944134d316926529ed2a99c8cb7791950968328bc4cc8afc1bb" Apr 17 
17:34:06.686832 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.686743 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"982f5b6bba733944134d316926529ed2a99c8cb7791950968328bc4cc8afc1bb"} err="failed to get container status \"982f5b6bba733944134d316926529ed2a99c8cb7791950968328bc4cc8afc1bb\": rpc error: code = NotFound desc = could not find container \"982f5b6bba733944134d316926529ed2a99c8cb7791950968328bc4cc8afc1bb\": container with ID starting with 982f5b6bba733944134d316926529ed2a99c8cb7791950968328bc4cc8afc1bb not found: ID does not exist" Apr 17 17:34:06.699217 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.699189 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"] Apr 17 17:34:06.702915 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:06.702895 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-rlbkf"] Apr 17 17:34:07.685274 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:07.685240 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-bbwsx" event={"ID":"0dcae6a1-3266-4e8a-bc79-fcd36bffaa2c","Type":"ContainerStarted","Data":"850ef952a6d013b501aa51af523944570618978a5db3b9a3b26d2714bef29453"} Apr 17 17:34:07.685782 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:07.685373 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-85dd7cfb4d-bbwsx" Apr 17 17:34:07.700720 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:07.700632 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-85dd7cfb4d-bbwsx" podStartSLOduration=1.328963986 podStartE2EDuration="1.70061639s" podCreationTimestamp="2026-04-17 17:34:06 +0000 UTC" firstStartedPulling="2026-04-17 17:34:06.495388005 +0000 UTC m=+539.117668661" lastFinishedPulling="2026-04-17 
17:34:06.86704041 +0000 UTC m=+539.489321065" observedRunningTime="2026-04-17 17:34:07.700033173 +0000 UTC m=+540.322313848" watchObservedRunningTime="2026-04-17 17:34:07.70061639 +0000 UTC m=+540.322897069" Apr 17 17:34:07.954704 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:07.954603 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3cff457-b524-4a38-849a-22f35dec88e0" path="/var/lib/kubelet/pods/b3cff457-b524-4a38-849a-22f35dec88e0/volumes" Apr 17 17:34:18.121755 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.121717 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-78565777f8-64tbj"] Apr 17 17:34:18.122260 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.122241 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3cff457-b524-4a38-849a-22f35dec88e0" containerName="manager" Apr 17 17:34:18.122342 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.122264 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3cff457-b524-4a38-849a-22f35dec88e0" containerName="manager" Apr 17 17:34:18.122400 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.122378 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3cff457-b524-4a38-849a-22f35dec88e0" containerName="manager" Apr 17 17:34:18.125018 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.124997 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.136537 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.136513 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78565777f8-64tbj"] Apr 17 17:34:18.169133 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.169104 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f48d5\" (UniqueName: \"kubernetes.io/projected/b27b925a-4241-4612-8244-b96ad45d3c7b-kube-api-access-f48d5\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.169294 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.169145 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b27b925a-4241-4612-8244-b96ad45d3c7b-console-config\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.169294 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.169164 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b27b925a-4241-4612-8244-b96ad45d3c7b-trusted-ca-bundle\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.169294 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.169182 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b27b925a-4241-4612-8244-b96ad45d3c7b-console-serving-cert\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 
17:34:18.169294 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.169200 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b27b925a-4241-4612-8244-b96ad45d3c7b-service-ca\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.169294 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.169282 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b27b925a-4241-4612-8244-b96ad45d3c7b-oauth-serving-cert\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.169455 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.169340 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b27b925a-4241-4612-8244-b96ad45d3c7b-console-oauth-config\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.270010 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.269982 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b27b925a-4241-4612-8244-b96ad45d3c7b-oauth-serving-cert\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.270195 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.270028 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b27b925a-4241-4612-8244-b96ad45d3c7b-console-oauth-config\") pod 
\"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.270195 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.270074 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f48d5\" (UniqueName: \"kubernetes.io/projected/b27b925a-4241-4612-8244-b96ad45d3c7b-kube-api-access-f48d5\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.270195 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.270184 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b27b925a-4241-4612-8244-b96ad45d3c7b-console-config\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.270385 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.270214 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b27b925a-4241-4612-8244-b96ad45d3c7b-trusted-ca-bundle\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.270385 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.270239 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b27b925a-4241-4612-8244-b96ad45d3c7b-console-serving-cert\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.270385 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.270269 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/b27b925a-4241-4612-8244-b96ad45d3c7b-service-ca\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.270818 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.270788 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b27b925a-4241-4612-8244-b96ad45d3c7b-oauth-serving-cert\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.270943 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.270919 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b27b925a-4241-4612-8244-b96ad45d3c7b-console-config\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.271081 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.271064 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b27b925a-4241-4612-8244-b96ad45d3c7b-service-ca\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.271119 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.271070 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b27b925a-4241-4612-8244-b96ad45d3c7b-trusted-ca-bundle\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.273033 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.273007 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/b27b925a-4241-4612-8244-b96ad45d3c7b-console-serving-cert\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.273125 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.273104 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b27b925a-4241-4612-8244-b96ad45d3c7b-console-oauth-config\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.284058 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.284034 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f48d5\" (UniqueName: \"kubernetes.io/projected/b27b925a-4241-4612-8244-b96ad45d3c7b-kube-api-access-f48d5\") pod \"console-78565777f8-64tbj\" (UID: \"b27b925a-4241-4612-8244-b96ad45d3c7b\") " pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.436250 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.436169 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:18.561609 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.561584 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78565777f8-64tbj"] Apr 17 17:34:18.564228 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:34:18.564200 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb27b925a_4241_4612_8244_b96ad45d3c7b.slice/crio-27df07ba703a3bfcc3ef987d5531376fd51541b0a9f82a40b35326669c4d4a2c WatchSource:0}: Error finding container 27df07ba703a3bfcc3ef987d5531376fd51541b0a9f82a40b35326669c4d4a2c: Status 404 returned error can't find the container with id 27df07ba703a3bfcc3ef987d5531376fd51541b0a9f82a40b35326669c4d4a2c Apr 17 17:34:18.721367 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.721282 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78565777f8-64tbj" event={"ID":"b27b925a-4241-4612-8244-b96ad45d3c7b","Type":"ContainerStarted","Data":"5fa9e76cd0697a3918b96187aa794fea6d9e8e8c8e65f5f16cc10c173111437a"} Apr 17 17:34:18.721367 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.721325 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78565777f8-64tbj" event={"ID":"b27b925a-4241-4612-8244-b96ad45d3c7b","Type":"ContainerStarted","Data":"27df07ba703a3bfcc3ef987d5531376fd51541b0a9f82a40b35326669c4d4a2c"} Apr 17 17:34:18.738586 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:18.738539 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78565777f8-64tbj" podStartSLOduration=0.738525089 podStartE2EDuration="738.525089ms" podCreationTimestamp="2026-04-17 17:34:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:34:18.737562717 +0000 UTC m=+551.359843405" 
watchObservedRunningTime="2026-04-17 17:34:18.738525089 +0000 UTC m=+551.360805763" Apr 17 17:34:28.436807 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:28.436772 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:28.436807 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:28.436814 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:28.441701 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:28.441656 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:28.757884 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:28.757807 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-78565777f8-64tbj" Apr 17 17:34:28.802854 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:28.802811 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c4499b4f7-lhs25"] Apr 17 17:34:38.692306 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:38.692277 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-85dd7cfb4d-bbwsx" Apr 17 17:34:53.824312 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:53.824261 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c4499b4f7-lhs25" podUID="6264fc53-ae53-460d-b9d8-ddd4db430c08" containerName="console" containerID="cri-o://91af09575fc89786d3910898e2e352b65a745c10f59b60d12d1f73b94159b7dd" gracePeriod=15 Apr 17 17:34:54.057436 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.057413 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c4499b4f7-lhs25_6264fc53-ae53-460d-b9d8-ddd4db430c08/console/0.log" Apr 17 17:34:54.057561 ip-10-0-137-46 kubenswrapper[2546]: I0417 
17:34:54.057473 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:34:54.185740 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.185645 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-service-ca\") pod \"6264fc53-ae53-460d-b9d8-ddd4db430c08\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " Apr 17 17:34:54.185740 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.185711 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-oauth-config\") pod \"6264fc53-ae53-460d-b9d8-ddd4db430c08\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " Apr 17 17:34:54.185927 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.185765 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg7ld\" (UniqueName: \"kubernetes.io/projected/6264fc53-ae53-460d-b9d8-ddd4db430c08-kube-api-access-bg7ld\") pod \"6264fc53-ae53-460d-b9d8-ddd4db430c08\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " Apr 17 17:34:54.185927 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.185802 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-config\") pod \"6264fc53-ae53-460d-b9d8-ddd4db430c08\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " Apr 17 17:34:54.185927 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.185823 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-oauth-serving-cert\") pod \"6264fc53-ae53-460d-b9d8-ddd4db430c08\" (UID: 
\"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " Apr 17 17:34:54.185927 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.185847 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-serving-cert\") pod \"6264fc53-ae53-460d-b9d8-ddd4db430c08\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " Apr 17 17:34:54.185927 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.185871 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-trusted-ca-bundle\") pod \"6264fc53-ae53-460d-b9d8-ddd4db430c08\" (UID: \"6264fc53-ae53-460d-b9d8-ddd4db430c08\") " Apr 17 17:34:54.186179 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.186115 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-service-ca" (OuterVolumeSpecName: "service-ca") pod "6264fc53-ae53-460d-b9d8-ddd4db430c08" (UID: "6264fc53-ae53-460d-b9d8-ddd4db430c08"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:34:54.186354 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.186324 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-config" (OuterVolumeSpecName: "console-config") pod "6264fc53-ae53-460d-b9d8-ddd4db430c08" (UID: "6264fc53-ae53-460d-b9d8-ddd4db430c08"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:34:54.186354 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.186339 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6264fc53-ae53-460d-b9d8-ddd4db430c08" (UID: "6264fc53-ae53-460d-b9d8-ddd4db430c08"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:34:54.186540 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.186484 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6264fc53-ae53-460d-b9d8-ddd4db430c08" (UID: "6264fc53-ae53-460d-b9d8-ddd4db430c08"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:34:54.187847 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.187822 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6264fc53-ae53-460d-b9d8-ddd4db430c08" (UID: "6264fc53-ae53-460d-b9d8-ddd4db430c08"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:34:54.187971 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.187946 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6264fc53-ae53-460d-b9d8-ddd4db430c08" (UID: "6264fc53-ae53-460d-b9d8-ddd4db430c08"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:34:54.188013 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.187988 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6264fc53-ae53-460d-b9d8-ddd4db430c08-kube-api-access-bg7ld" (OuterVolumeSpecName: "kube-api-access-bg7ld") pod "6264fc53-ae53-460d-b9d8-ddd4db430c08" (UID: "6264fc53-ae53-460d-b9d8-ddd4db430c08"). InnerVolumeSpecName "kube-api-access-bg7ld". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:34:54.286985 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.286959 2546 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:34:54.286985 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.286984 2546 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-oauth-serving-cert\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:34:54.286985 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.286994 2546 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-serving-cert\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:34:54.287185 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.287002 2546 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-trusted-ca-bundle\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:34:54.287185 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.287011 2546 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/6264fc53-ae53-460d-b9d8-ddd4db430c08-service-ca\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:34:54.287185 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.287020 2546 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6264fc53-ae53-460d-b9d8-ddd4db430c08-console-oauth-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:34:54.287185 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.287029 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bg7ld\" (UniqueName: \"kubernetes.io/projected/6264fc53-ae53-460d-b9d8-ddd4db430c08-kube-api-access-bg7ld\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:34:54.838152 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.838126 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c4499b4f7-lhs25_6264fc53-ae53-460d-b9d8-ddd4db430c08/console/0.log" Apr 17 17:34:54.838539 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.838166 2546 generic.go:358] "Generic (PLEG): container finished" podID="6264fc53-ae53-460d-b9d8-ddd4db430c08" containerID="91af09575fc89786d3910898e2e352b65a745c10f59b60d12d1f73b94159b7dd" exitCode=2 Apr 17 17:34:54.838539 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.838204 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c4499b4f7-lhs25" event={"ID":"6264fc53-ae53-460d-b9d8-ddd4db430c08","Type":"ContainerDied","Data":"91af09575fc89786d3910898e2e352b65a745c10f59b60d12d1f73b94159b7dd"} Apr 17 17:34:54.838539 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.838234 2546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c4499b4f7-lhs25" Apr 17 17:34:54.838539 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.838256 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c4499b4f7-lhs25" event={"ID":"6264fc53-ae53-460d-b9d8-ddd4db430c08","Type":"ContainerDied","Data":"1d1ad9a0f90568ef5219321a1c1f7c3bc937dba57c2583a5748c508784ad1365"} Apr 17 17:34:54.838539 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.838278 2546 scope.go:117] "RemoveContainer" containerID="91af09575fc89786d3910898e2e352b65a745c10f59b60d12d1f73b94159b7dd" Apr 17 17:34:54.846910 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.846889 2546 scope.go:117] "RemoveContainer" containerID="91af09575fc89786d3910898e2e352b65a745c10f59b60d12d1f73b94159b7dd" Apr 17 17:34:54.847174 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:34:54.847155 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91af09575fc89786d3910898e2e352b65a745c10f59b60d12d1f73b94159b7dd\": container with ID starting with 91af09575fc89786d3910898e2e352b65a745c10f59b60d12d1f73b94159b7dd not found: ID does not exist" containerID="91af09575fc89786d3910898e2e352b65a745c10f59b60d12d1f73b94159b7dd" Apr 17 17:34:54.847240 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.847184 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91af09575fc89786d3910898e2e352b65a745c10f59b60d12d1f73b94159b7dd"} err="failed to get container status \"91af09575fc89786d3910898e2e352b65a745c10f59b60d12d1f73b94159b7dd\": rpc error: code = NotFound desc = could not find container \"91af09575fc89786d3910898e2e352b65a745c10f59b60d12d1f73b94159b7dd\": container with ID starting with 91af09575fc89786d3910898e2e352b65a745c10f59b60d12d1f73b94159b7dd not found: ID does not exist" Apr 17 17:34:54.860175 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.860151 2546 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-5c4499b4f7-lhs25"] Apr 17 17:34:54.863251 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:54.863232 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c4499b4f7-lhs25"] Apr 17 17:34:55.953936 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:34:55.953906 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6264fc53-ae53-460d-b9d8-ddd4db430c08" path="/var/lib/kubelet/pods/6264fc53-ae53-460d-b9d8-ddd4db430c08/volumes" Apr 17 17:35:07.849830 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:07.849804 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/ovn-acl-logging/0.log" Apr 17 17:35:07.850248 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:07.850047 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/ovn-acl-logging/0.log" Apr 17 17:35:15.776357 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.776322 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl"] Apr 17 17:35:15.776869 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.776698 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6264fc53-ae53-460d-b9d8-ddd4db430c08" containerName="console" Apr 17 17:35:15.776869 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.776715 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6264fc53-ae53-460d-b9d8-ddd4db430c08" containerName="console" Apr 17 17:35:15.776869 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.776799 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="6264fc53-ae53-460d-b9d8-ddd4db430c08" containerName="console" Apr 17 17:35:15.779890 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.779869 2546 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:15.782114 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.782087 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-a3ac3-predictor-serving-cert\"" Apr 17 17:35:15.782239 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.782150 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-a3ac3-kube-rbac-proxy-sar-config\"" Apr 17 17:35:15.782339 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.782268 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gt7v2\"" Apr 17 17:35:15.782339 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.782309 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 17:35:15.782475 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.782370 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 17:35:15.793773 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.790494 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl"] Apr 17 17:35:15.867760 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.867722 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-747lw\" (UniqueName: \"kubernetes.io/projected/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-kube-api-access-747lw\") pod \"isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 
17:35:15.867931 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.867768 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-proxy-tls\") pod \"isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:15.867931 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.867862 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-raw-sklearn-batcher-a3ac3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-isvc-raw-sklearn-batcher-a3ac3-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:15.867931 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.867887 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:15.968659 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.968625 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-proxy-tls\") pod \"isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") " 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:15.968876 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.968759 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-raw-sklearn-batcher-a3ac3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-isvc-raw-sklearn-batcher-a3ac3-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:15.968876 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.968803 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:15.968876 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.968844 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-747lw\" (UniqueName: \"kubernetes.io/projected/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-kube-api-access-747lw\") pod \"isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:15.968876 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:35:15.968870 2546 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-serving-cert: secret "isvc-raw-sklearn-batcher-a3ac3-predictor-serving-cert" not found Apr 17 17:35:15.969218 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:35:15.968966 2546 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-proxy-tls podName:a625b39f-1c6b-43ca-bb35-7cf75a7b0432 nodeName:}" failed. No retries permitted until 2026-04-17 17:35:16.468931493 +0000 UTC m=+609.091212160 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-proxy-tls") pod "isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" (UID: "a625b39f-1c6b-43ca-bb35-7cf75a7b0432") : secret "isvc-raw-sklearn-batcher-a3ac3-predictor-serving-cert" not found Apr 17 17:35:15.969271 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.969236 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:15.969472 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.969454 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-raw-sklearn-batcher-a3ac3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-isvc-raw-sklearn-batcher-a3ac3-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:15.977914 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:15.977895 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-747lw\" (UniqueName: \"kubernetes.io/projected/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-kube-api-access-747lw\") pod 
\"isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:16.471992 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:16.471955 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-proxy-tls\") pod \"isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:16.474357 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:16.474338 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-proxy-tls\") pod \"isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:16.696005 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:16.695962 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:16.820043 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:16.820018 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl"] Apr 17 17:35:16.822319 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:35:16.822288 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda625b39f_1c6b_43ca_bb35_7cf75a7b0432.slice/crio-da857eb8b1a09b2776cf115ed6bd0188f33e0a2e807270963faec60f0aa7d123 WatchSource:0}: Error finding container da857eb8b1a09b2776cf115ed6bd0188f33e0a2e807270963faec60f0aa7d123: Status 404 returned error can't find the container with id da857eb8b1a09b2776cf115ed6bd0188f33e0a2e807270963faec60f0aa7d123 Apr 17 17:35:16.913454 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:16.913423 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" event={"ID":"a625b39f-1c6b-43ca-bb35-7cf75a7b0432","Type":"ContainerStarted","Data":"da857eb8b1a09b2776cf115ed6bd0188f33e0a2e807270963faec60f0aa7d123"} Apr 17 17:35:19.926965 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:19.926927 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" event={"ID":"a625b39f-1c6b-43ca-bb35-7cf75a7b0432","Type":"ContainerStarted","Data":"d93d5b2792b3f6f3608d4ae6be7b50e1b6bf6ccb3530220056551cffc9399e41"} Apr 17 17:35:23.944770 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:23.944734 2546 generic.go:358] "Generic (PLEG): container finished" podID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerID="d93d5b2792b3f6f3608d4ae6be7b50e1b6bf6ccb3530220056551cffc9399e41" exitCode=0 Apr 17 17:35:23.945165 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:23.944807 2546 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" event={"ID":"a625b39f-1c6b-43ca-bb35-7cf75a7b0432","Type":"ContainerDied","Data":"d93d5b2792b3f6f3608d4ae6be7b50e1b6bf6ccb3530220056551cffc9399e41"} Apr 17 17:35:36.124647 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:36.124628 2546 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:35:37.000526 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:37.000477 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" event={"ID":"a625b39f-1c6b-43ca-bb35-7cf75a7b0432","Type":"ContainerStarted","Data":"f4347a2abc554551256a0383cf7a553a4a6ff36a198d35244be7d7dca4a464a0"} Apr 17 17:35:39.009560 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:39.009524 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" event={"ID":"a625b39f-1c6b-43ca-bb35-7cf75a7b0432","Type":"ContainerStarted","Data":"30ada2641bdbeecc7edca42be130778437d1bc83d0fdf02c9fa16d2f3693cc00"} Apr 17 17:35:42.022272 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:42.022234 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" event={"ID":"a625b39f-1c6b-43ca-bb35-7cf75a7b0432","Type":"ContainerStarted","Data":"a41ef51a160c8ea85c7e3f44db2f325a019a3fc1589953da393d924060c0780e"} Apr 17 17:35:42.022641 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:42.022483 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:42.022641 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:42.022608 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:42.023933 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:42.023904 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 17 17:35:42.044457 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:42.044405 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podStartSLOduration=2.642728447 podStartE2EDuration="27.044388823s" podCreationTimestamp="2026-04-17 17:35:15 +0000 UTC" firstStartedPulling="2026-04-17 17:35:16.824378264 +0000 UTC m=+609.446658917" lastFinishedPulling="2026-04-17 17:35:41.22603864 +0000 UTC m=+633.848319293" observedRunningTime="2026-04-17 17:35:42.042668794 +0000 UTC m=+634.664949481" watchObservedRunningTime="2026-04-17 17:35:42.044388823 +0000 UTC m=+634.666669500" Apr 17 17:35:43.025630 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:43.025594 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:43.026115 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:43.025771 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 17 17:35:43.026855 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:43.026821 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" 
podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:43.030193 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:43.030174 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:35:44.029214 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:44.029174 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 17 17:35:44.029656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:44.029464 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:45.032605 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:45.032562 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 17 17:35:45.033023 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:45.032924 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:55.032735 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:55.032659 2546 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 17 17:35:55.033306 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:35:55.033114 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:36:05.032934 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:05.032886 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 17 17:36:05.033356 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:05.033327 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:36:15.033077 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:15.033026 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 17 17:36:15.033551 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:15.033452 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" 
podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:36:25.032722 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:25.032652 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 17 17:36:25.033163 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:25.033071 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:36:35.033238 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:35.033187 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 17 17:36:35.033716 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:35.033596 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:36:45.033362 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:45.033329 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:36:45.033825 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:45.033614 2546 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:36:50.992865 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:50.992766 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl"] Apr 17 17:36:50.993321 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:50.993261 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kserve-container" containerID="cri-o://f4347a2abc554551256a0383cf7a553a4a6ff36a198d35244be7d7dca4a464a0" gracePeriod=30 Apr 17 17:36:50.993489 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:50.993430 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="agent" containerID="cri-o://a41ef51a160c8ea85c7e3f44db2f325a019a3fc1589953da393d924060c0780e" gracePeriod=30 Apr 17 17:36:50.993853 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:50.993782 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kube-rbac-proxy" containerID="cri-o://30ada2641bdbeecc7edca42be130778437d1bc83d0fdf02c9fa16d2f3693cc00" gracePeriod=30 Apr 17 17:36:51.096669 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.096639 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"] Apr 17 17:36:51.100821 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.100797 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:36:51.103705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.103649 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-c3f82-predictor-serving-cert\""
Apr 17 17:36:51.103853 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.103660 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-c3f82-kube-rbac-proxy-sar-config\""
Apr 17 17:36:51.111749 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.111719 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"]
Apr 17 17:36:51.160726 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.160674 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"]
Apr 17 17:36:51.164463 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.164443 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:36:51.166641 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.166625 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-c3f82-predictor-serving-cert\""
Apr 17 17:36:51.166765 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.166650 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-c3f82-kube-rbac-proxy-sar-config\""
Apr 17 17:36:51.176920 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.176893 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"]
Apr 17 17:36:51.242529 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.242497 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f610c89b-d09c-430d-90f0-b490ebd3d466-proxy-tls\") pod \"isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp\" (UID: \"f610c89b-d09c-430d-90f0-b490ebd3d466\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:36:51.242722 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.242544 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt6m4\" (UniqueName: \"kubernetes.io/projected/f610c89b-d09c-430d-90f0-b490ebd3d466-kube-api-access-xt6m4\") pod \"isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp\" (UID: \"f610c89b-d09c-430d-90f0-b490ebd3d466\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:36:51.242722 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.242609 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-c3f82-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db934d6a-6776-4aca-8a20-86bcd0da3864-isvc-xgboost-graph-raw-c3f82-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:36:51.242722 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.242637 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8dph\" (UniqueName: \"kubernetes.io/projected/db934d6a-6776-4aca-8a20-86bcd0da3864-kube-api-access-l8dph\") pod \"isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:36:51.242722 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.242712 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db934d6a-6776-4aca-8a20-86bcd0da3864-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:36:51.242998 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.242752 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db934d6a-6776-4aca-8a20-86bcd0da3864-proxy-tls\") pod \"isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:36:51.242998 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.242775 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-c3f82-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f610c89b-d09c-430d-90f0-b490ebd3d466-isvc-sklearn-graph-raw-c3f82-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp\" (UID: \"f610c89b-d09c-430d-90f0-b490ebd3d466\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:36:51.242998 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.242796 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f610c89b-d09c-430d-90f0-b490ebd3d466-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp\" (UID: \"f610c89b-d09c-430d-90f0-b490ebd3d466\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:36:51.253471 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.253442 2546 generic.go:358] "Generic (PLEG): container finished" podID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerID="30ada2641bdbeecc7edca42be130778437d1bc83d0fdf02c9fa16d2f3693cc00" exitCode=2
Apr 17 17:36:51.253622 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.253522 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" event={"ID":"a625b39f-1c6b-43ca-bb35-7cf75a7b0432","Type":"ContainerDied","Data":"30ada2641bdbeecc7edca42be130778437d1bc83d0fdf02c9fa16d2f3693cc00"}
Apr 17 17:36:51.343776 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.343724 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt6m4\" (UniqueName: \"kubernetes.io/projected/f610c89b-d09c-430d-90f0-b490ebd3d466-kube-api-access-xt6m4\") pod \"isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp\" (UID: \"f610c89b-d09c-430d-90f0-b490ebd3d466\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:36:51.343934 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.343839 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-c3f82-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db934d6a-6776-4aca-8a20-86bcd0da3864-isvc-xgboost-graph-raw-c3f82-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:36:51.343934 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.343894 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8dph\" (UniqueName: \"kubernetes.io/projected/db934d6a-6776-4aca-8a20-86bcd0da3864-kube-api-access-l8dph\") pod \"isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:36:51.344051 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.343941 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db934d6a-6776-4aca-8a20-86bcd0da3864-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:36:51.344051 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.343971 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db934d6a-6776-4aca-8a20-86bcd0da3864-proxy-tls\") pod \"isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:36:51.344051 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.343998 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-c3f82-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f610c89b-d09c-430d-90f0-b490ebd3d466-isvc-sklearn-graph-raw-c3f82-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp\" (UID: \"f610c89b-d09c-430d-90f0-b490ebd3d466\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:36:51.344051 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.344026 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f610c89b-d09c-430d-90f0-b490ebd3d466-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp\" (UID: \"f610c89b-d09c-430d-90f0-b490ebd3d466\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:36:51.344253 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.344082 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f610c89b-d09c-430d-90f0-b490ebd3d466-proxy-tls\") pod \"isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp\" (UID: \"f610c89b-d09c-430d-90f0-b490ebd3d466\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:36:51.344253 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:36:51.344097 2546 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-serving-cert: secret "isvc-xgboost-graph-raw-c3f82-predictor-serving-cert" not found
Apr 17 17:36:51.344253 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:36:51.344160 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db934d6a-6776-4aca-8a20-86bcd0da3864-proxy-tls podName:db934d6a-6776-4aca-8a20-86bcd0da3864 nodeName:}" failed. No retries permitted until 2026-04-17 17:36:51.844138705 +0000 UTC m=+704.466419361 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/db934d6a-6776-4aca-8a20-86bcd0da3864-proxy-tls") pod "isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" (UID: "db934d6a-6776-4aca-8a20-86bcd0da3864") : secret "isvc-xgboost-graph-raw-c3f82-predictor-serving-cert" not found
Apr 17 17:36:51.344410 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.344340 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db934d6a-6776-4aca-8a20-86bcd0da3864-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:36:51.344516 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.344493 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f610c89b-d09c-430d-90f0-b490ebd3d466-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp\" (UID: \"f610c89b-d09c-430d-90f0-b490ebd3d466\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:36:51.344733 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.344712 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-c3f82-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db934d6a-6776-4aca-8a20-86bcd0da3864-isvc-xgboost-graph-raw-c3f82-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:36:51.344801 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.344752 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-c3f82-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f610c89b-d09c-430d-90f0-b490ebd3d466-isvc-sklearn-graph-raw-c3f82-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp\" (UID: \"f610c89b-d09c-430d-90f0-b490ebd3d466\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:36:51.346611 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.346591 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f610c89b-d09c-430d-90f0-b490ebd3d466-proxy-tls\") pod \"isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp\" (UID: \"f610c89b-d09c-430d-90f0-b490ebd3d466\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:36:51.353488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.353454 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt6m4\" (UniqueName: \"kubernetes.io/projected/f610c89b-d09c-430d-90f0-b490ebd3d466-kube-api-access-xt6m4\") pod \"isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp\" (UID: \"f610c89b-d09c-430d-90f0-b490ebd3d466\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:36:51.353591 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.353573 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8dph\" (UniqueName: \"kubernetes.io/projected/db934d6a-6776-4aca-8a20-86bcd0da3864-kube-api-access-l8dph\") pod \"isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:36:51.415068 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.415041 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:36:51.546030 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.545998 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"]
Apr 17 17:36:51.549169 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:36:51.549136 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf610c89b_d09c_430d_90f0_b490ebd3d466.slice/crio-605a4532fac29efc19d65c3d8996e84d46540a2c030b9023ed0374d2b37abe8b WatchSource:0}: Error finding container 605a4532fac29efc19d65c3d8996e84d46540a2c030b9023ed0374d2b37abe8b: Status 404 returned error can't find the container with id 605a4532fac29efc19d65c3d8996e84d46540a2c030b9023ed0374d2b37abe8b
Apr 17 17:36:51.848991 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.848951 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db934d6a-6776-4aca-8a20-86bcd0da3864-proxy-tls\") pod \"isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:36:51.851384 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:51.851359 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db934d6a-6776-4aca-8a20-86bcd0da3864-proxy-tls\") pod \"isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:36:52.076752 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:52.076710 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:36:52.203328 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:52.203305 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"]
Apr 17 17:36:52.205117 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:36:52.205090 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb934d6a_6776_4aca_8a20_86bcd0da3864.slice/crio-da00db7181bbbfb0ca8b9017393954c3047f2a2df76e99908fdcff2f58ac7041 WatchSource:0}: Error finding container da00db7181bbbfb0ca8b9017393954c3047f2a2df76e99908fdcff2f58ac7041: Status 404 returned error can't find the container with id da00db7181bbbfb0ca8b9017393954c3047f2a2df76e99908fdcff2f58ac7041
Apr 17 17:36:52.257738 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:52.257656 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" event={"ID":"db934d6a-6776-4aca-8a20-86bcd0da3864","Type":"ContainerStarted","Data":"da00db7181bbbfb0ca8b9017393954c3047f2a2df76e99908fdcff2f58ac7041"}
Apr 17 17:36:52.259083 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:52.259054 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" event={"ID":"f610c89b-d09c-430d-90f0-b490ebd3d466","Type":"ContainerStarted","Data":"fcdbf757283d3909ebc2afcb6efcbf6434a58e46476c438165e0b1767868ac82"}
Apr 17 17:36:52.259168 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:52.259094 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" event={"ID":"f610c89b-d09c-430d-90f0-b490ebd3d466","Type":"ContainerStarted","Data":"605a4532fac29efc19d65c3d8996e84d46540a2c030b9023ed0374d2b37abe8b"}
Apr 17 17:36:53.026886 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:53.026840 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused"
Apr 17 17:36:53.264185 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:53.264152 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" event={"ID":"db934d6a-6776-4aca-8a20-86bcd0da3864","Type":"ContainerStarted","Data":"061a790a80d075ae6c86b3f477abf59ac8f21528825eace2f3ff585f3d6f9f79"}
Apr 17 17:36:55.033303 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:55.033256 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 17 17:36:55.033745 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:55.033581 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:36:55.273286 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:55.273253 2546 generic.go:358] "Generic (PLEG): container finished" podID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerID="f4347a2abc554551256a0383cf7a553a4a6ff36a198d35244be7d7dca4a464a0" exitCode=0
Apr 17 17:36:55.273471 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:55.273304 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" event={"ID":"a625b39f-1c6b-43ca-bb35-7cf75a7b0432","Type":"ContainerDied","Data":"f4347a2abc554551256a0383cf7a553a4a6ff36a198d35244be7d7dca4a464a0"}
Apr 17 17:36:56.278475 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:56.278440 2546 generic.go:358] "Generic (PLEG): container finished" podID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerID="061a790a80d075ae6c86b3f477abf59ac8f21528825eace2f3ff585f3d6f9f79" exitCode=0
Apr 17 17:36:56.278907 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:56.278514 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" event={"ID":"db934d6a-6776-4aca-8a20-86bcd0da3864","Type":"ContainerDied","Data":"061a790a80d075ae6c86b3f477abf59ac8f21528825eace2f3ff585f3d6f9f79"}
Apr 17 17:36:56.280021 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:56.279998 2546 generic.go:358] "Generic (PLEG): container finished" podID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerID="fcdbf757283d3909ebc2afcb6efcbf6434a58e46476c438165e0b1767868ac82" exitCode=0
Apr 17 17:36:56.280119 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:56.280069 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" event={"ID":"f610c89b-d09c-430d-90f0-b490ebd3d466","Type":"ContainerDied","Data":"fcdbf757283d3909ebc2afcb6efcbf6434a58e46476c438165e0b1767868ac82"}
Apr 17 17:36:57.287028 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:57.286991 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" event={"ID":"f610c89b-d09c-430d-90f0-b490ebd3d466","Type":"ContainerStarted","Data":"a1fc0c1e0c2e6d83c89d1cbd4def20c1aedc4f409cc225be4bf11994b6972ff6"}
Apr 17 17:36:57.287452 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:57.287040 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" event={"ID":"f610c89b-d09c-430d-90f0-b490ebd3d466","Type":"ContainerStarted","Data":"60ea682aba3619dab891597298194bece0e30f8b7b12cc6829222ec868887301"}
Apr 17 17:36:57.287515 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:57.287460 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:36:57.311828 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:57.311753 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" podStartSLOduration=6.311732902 podStartE2EDuration="6.311732902s" podCreationTimestamp="2026-04-17 17:36:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:36:57.309006215 +0000 UTC m=+709.931286919" watchObservedRunningTime="2026-04-17 17:36:57.311732902 +0000 UTC m=+709.934013579"
Apr 17 17:36:58.027012 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:58.026946 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused"
Apr 17 17:36:58.292483 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:58.292452 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:36:58.294163 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:58.294126 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 17 17:36:59.301594 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:36:59.296896 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 17 17:37:03.026415 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:03.026372 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused"
Apr 17 17:37:03.026904 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:03.026544 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl"
Apr 17 17:37:04.302310 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:04.302280 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"
Apr 17 17:37:04.302970 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:04.302941 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 17 17:37:05.033457 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:05.033398 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 17 17:37:05.033837 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:05.033802 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:37:08.026219 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:08.026182 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused"
Apr 17 17:37:13.026553 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:13.026521 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused"
Apr 17 17:37:13.350052 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:13.350017 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" event={"ID":"db934d6a-6776-4aca-8a20-86bcd0da3864","Type":"ContainerStarted","Data":"0e0adba90d819784141ff3944bb9e14102f21f7bb7b9ba403f0352dde9fef01d"}
Apr 17 17:37:13.350052 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:13.350057 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" event={"ID":"db934d6a-6776-4aca-8a20-86bcd0da3864","Type":"ContainerStarted","Data":"7fec421778ce1652f456cd62e9c3f3c74f8ee5418abaee6a0b8c38ad31cc49da"}
Apr 17 17:37:13.350329 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:13.350317 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:37:13.370474 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:13.370429 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" podStartSLOduration=5.84770047 podStartE2EDuration="22.37041418s" podCreationTimestamp="2026-04-17 17:36:51 +0000 UTC" firstStartedPulling="2026-04-17 17:36:56.28001928 +0000 UTC m=+708.902299934" lastFinishedPulling="2026-04-17 17:37:12.802732992 +0000 UTC m=+725.425013644" observedRunningTime="2026-04-17 17:37:13.368210013 +0000 UTC m=+725.990490688" watchObservedRunningTime="2026-04-17 17:37:13.37041418 +0000 UTC m=+725.992694931"
Apr 17 17:37:14.303505 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:14.303462 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 17 17:37:14.353924 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:14.353895 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:37:14.355126 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:14.355098 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 17 17:37:15.033366 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:15.033322 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 17 17:37:15.033578 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:15.033450 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl"
Apr 17 17:37:15.033762 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:15.033734 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:37:15.033857 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:15.033843 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl"
Apr 17 17:37:15.357697 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:15.357644 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 17 17:37:18.026217 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:18.026175 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused"
Apr 17 17:37:20.362062 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:20.362034 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"
Apr 17 17:37:20.362636 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:20.362609 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 17 17:37:21.384614 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:21.384580 2546 generic.go:358] "Generic (PLEG): container finished" podID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerID="a41ef51a160c8ea85c7e3f44db2f325a019a3fc1589953da393d924060c0780e" exitCode=0
Apr 17 17:37:21.384994 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:21.384622 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" event={"ID":"a625b39f-1c6b-43ca-bb35-7cf75a7b0432","Type":"ContainerDied","Data":"a41ef51a160c8ea85c7e3f44db2f325a019a3fc1589953da393d924060c0780e"}
Apr 17 17:37:21.642620 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:21.642557 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl"
Apr 17 17:37:21.720881 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:21.720843 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-raw-sklearn-batcher-a3ac3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-isvc-raw-sklearn-batcher-a3ac3-kube-rbac-proxy-sar-config\") pod \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") "
Apr 17 17:37:21.721080 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:21.720900 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-747lw\" (UniqueName: \"kubernetes.io/projected/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-kube-api-access-747lw\") pod \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") "
Apr 17 17:37:21.721080 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:21.720945 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-proxy-tls\") pod \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") "
Apr 17 17:37:21.721080 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:21.720984 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-kserve-provision-location\") pod \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\" (UID: \"a625b39f-1c6b-43ca-bb35-7cf75a7b0432\") "
Apr 17 17:37:21.721358 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:21.721329 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-isvc-raw-sklearn-batcher-a3ac3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-raw-sklearn-batcher-a3ac3-kube-rbac-proxy-sar-config") pod "a625b39f-1c6b-43ca-bb35-7cf75a7b0432" (UID: "a625b39f-1c6b-43ca-bb35-7cf75a7b0432"). InnerVolumeSpecName "isvc-raw-sklearn-batcher-a3ac3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:37:21.721446 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:21.721340 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a625b39f-1c6b-43ca-bb35-7cf75a7b0432" (UID: "a625b39f-1c6b-43ca-bb35-7cf75a7b0432"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:37:21.723145 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:21.723125 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a625b39f-1c6b-43ca-bb35-7cf75a7b0432" (UID: "a625b39f-1c6b-43ca-bb35-7cf75a7b0432"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:37:21.723239 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:21.723218 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-kube-api-access-747lw" (OuterVolumeSpecName: "kube-api-access-747lw") pod "a625b39f-1c6b-43ca-bb35-7cf75a7b0432" (UID: "a625b39f-1c6b-43ca-bb35-7cf75a7b0432"). InnerVolumeSpecName "kube-api-access-747lw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:37:21.822199 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:21.822166 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-747lw\" (UniqueName: \"kubernetes.io/projected/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-kube-api-access-747lw\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:37:21.822199 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:21.822197 2546 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-proxy-tls\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:37:21.822394 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:21.822211 2546 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-kserve-provision-location\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:37:21.822394 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:21.822225 2546 reconciler_common.go:299] "Volume detached for volume \"isvc-raw-sklearn-batcher-a3ac3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a625b39f-1c6b-43ca-bb35-7cf75a7b0432-isvc-raw-sklearn-batcher-a3ac3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:37:22.390284 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:22.390260 2546 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" Apr 17 17:37:22.390284 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:22.390265 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl" event={"ID":"a625b39f-1c6b-43ca-bb35-7cf75a7b0432","Type":"ContainerDied","Data":"da857eb8b1a09b2776cf115ed6bd0188f33e0a2e807270963faec60f0aa7d123"} Apr 17 17:37:22.390823 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:22.390311 2546 scope.go:117] "RemoveContainer" containerID="a41ef51a160c8ea85c7e3f44db2f325a019a3fc1589953da393d924060c0780e" Apr 17 17:37:22.398504 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:22.398482 2546 scope.go:117] "RemoveContainer" containerID="30ada2641bdbeecc7edca42be130778437d1bc83d0fdf02c9fa16d2f3693cc00" Apr 17 17:37:22.405512 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:22.405493 2546 scope.go:117] "RemoveContainer" containerID="f4347a2abc554551256a0383cf7a553a4a6ff36a198d35244be7d7dca4a464a0" Apr 17 17:37:22.408549 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:22.408525 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl"] Apr 17 17:37:22.412694 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:22.412662 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-a3ac3-predictor-6d555cb54f-7qgsl"] Apr 17 17:37:22.414741 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:22.414728 2546 scope.go:117] "RemoveContainer" containerID="d93d5b2792b3f6f3608d4ae6be7b50e1b6bf6ccb3530220056551cffc9399e41" Apr 17 17:37:23.954245 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:23.954209 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" path="/var/lib/kubelet/pods/a625b39f-1c6b-43ca-bb35-7cf75a7b0432/volumes" Apr 17 17:37:24.302933 
ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:24.302893 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 17 17:37:30.363590 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:30.363544 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 17 17:37:34.303486 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:34.303449 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 17 17:37:40.363420 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:40.363380 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 17 17:37:44.303176 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:44.303135 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 17 17:37:50.363398 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:50.363355 2546 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 17 17:37:54.303733 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:37:54.303671 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 17 17:38:00.363486 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:00.363435 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 17 17:38:04.302947 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:04.302901 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 17 17:38:10.362589 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:10.362543 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 17 17:38:14.303837 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:14.303805 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" Apr 17 17:38:20.363593 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:20.363560 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" Apr 17 17:38:41.336869 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.336836 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"] Apr 17 17:38:41.337329 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.337191 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kserve-container" containerID="cri-o://60ea682aba3619dab891597298194bece0e30f8b7b12cc6829222ec868887301" gracePeriod=30 Apr 17 17:38:41.337329 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.337207 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kube-rbac-proxy" containerID="cri-o://a1fc0c1e0c2e6d83c89d1cbd4def20c1aedc4f409cc225be4bf11994b6972ff6" gracePeriod=30 Apr 17 17:38:41.400453 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.400418 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl"] Apr 17 17:38:41.400826 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.400812 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kserve-container" Apr 17 17:38:41.400880 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.400828 2546 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kserve-container" Apr 17 17:38:41.400880 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.400839 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kube-rbac-proxy" Apr 17 17:38:41.400880 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.400844 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kube-rbac-proxy" Apr 17 17:38:41.400880 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.400855 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="storage-initializer" Apr 17 17:38:41.400880 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.400860 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="storage-initializer" Apr 17 17:38:41.400880 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.400870 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="agent" Apr 17 17:38:41.400880 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.400876 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="agent" Apr 17 17:38:41.401084 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.400927 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="agent" Apr 17 17:38:41.401084 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.400936 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kserve-container" Apr 17 17:38:41.401084 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.400943 2546 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="a625b39f-1c6b-43ca-bb35-7cf75a7b0432" containerName="kube-rbac-proxy" Apr 17 17:38:41.404310 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.404290 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:41.406658 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.406635 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-4dd55-predictor-serving-cert\"" Apr 17 17:38:41.406788 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.406723 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\"" Apr 17 17:38:41.415543 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.415522 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl"] Apr 17 17:38:41.498970 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.498935 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"] Apr 17 17:38:41.499263 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.499240 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="kserve-container" containerID="cri-o://7fec421778ce1652f456cd62e9c3f3c74f8ee5418abaee6a0b8c38ad31cc49da" gracePeriod=30 Apr 17 17:38:41.499364 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.499325 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="kube-rbac-proxy" 
containerID="cri-o://0e0adba90d819784141ff3944bb9e14102f21f7bb7b9ba403f0352dde9fef01d" gracePeriod=30 Apr 17 17:38:41.523014 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.522984 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65d2bccb-9024-496e-af5a-bef4dc4b24bc-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl\" (UID: \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:41.523123 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.523053 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57sps\" (UniqueName: \"kubernetes.io/projected/65d2bccb-9024-496e-af5a-bef4dc4b24bc-kube-api-access-57sps\") pod \"isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl\" (UID: \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:41.523123 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.523080 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65d2bccb-9024-496e-af5a-bef4dc4b24bc-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl\" (UID: \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:41.523214 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.523132 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/65d2bccb-9024-496e-af5a-bef4dc4b24bc-isvc-sklearn-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\") 
pod \"isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl\" (UID: \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:41.623882 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.623852 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65d2bccb-9024-496e-af5a-bef4dc4b24bc-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl\" (UID: \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:41.624007 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.623928 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57sps\" (UniqueName: \"kubernetes.io/projected/65d2bccb-9024-496e-af5a-bef4dc4b24bc-kube-api-access-57sps\") pod \"isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl\" (UID: \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:41.624007 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.623950 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65d2bccb-9024-496e-af5a-bef4dc4b24bc-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl\" (UID: \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:41.624155 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.624129 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/65d2bccb-9024-496e-af5a-bef4dc4b24bc-isvc-sklearn-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl\" (UID: \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:41.624261 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.624242 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65d2bccb-9024-496e-af5a-bef4dc4b24bc-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl\" (UID: \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:41.624687 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.624653 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/65d2bccb-9024-496e-af5a-bef4dc4b24bc-isvc-sklearn-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl\" (UID: \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:41.626145 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.626125 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65d2bccb-9024-496e-af5a-bef4dc4b24bc-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl\" (UID: \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:41.632981 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.632955 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-57sps\" (UniqueName: \"kubernetes.io/projected/65d2bccb-9024-496e-af5a-bef4dc4b24bc-kube-api-access-57sps\") pod \"isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl\" (UID: \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:41.674424 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.674393 2546 generic.go:358] "Generic (PLEG): container finished" podID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerID="0e0adba90d819784141ff3944bb9e14102f21f7bb7b9ba403f0352dde9fef01d" exitCode=2 Apr 17 17:38:41.674596 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.674464 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" event={"ID":"db934d6a-6776-4aca-8a20-86bcd0da3864","Type":"ContainerDied","Data":"0e0adba90d819784141ff3944bb9e14102f21f7bb7b9ba403f0352dde9fef01d"} Apr 17 17:38:41.676189 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.676167 2546 generic.go:358] "Generic (PLEG): container finished" podID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerID="a1fc0c1e0c2e6d83c89d1cbd4def20c1aedc4f409cc225be4bf11994b6972ff6" exitCode=2 Apr 17 17:38:41.676296 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.676195 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" event={"ID":"f610c89b-d09c-430d-90f0-b490ebd3d466","Type":"ContainerDied","Data":"a1fc0c1e0c2e6d83c89d1cbd4def20c1aedc4f409cc225be4bf11994b6972ff6"} Apr 17 17:38:41.716333 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.716304 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:41.841839 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:41.841814 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl"] Apr 17 17:38:41.843743 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:38:41.843710 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65d2bccb_9024_496e_af5a_bef4dc4b24bc.slice/crio-598843be310cb34fcc9637eefc43502f2ea122cf551ec99d778bd5a4952a1ae3 WatchSource:0}: Error finding container 598843be310cb34fcc9637eefc43502f2ea122cf551ec99d778bd5a4952a1ae3: Status 404 returned error can't find the container with id 598843be310cb34fcc9637eefc43502f2ea122cf551ec99d778bd5a4952a1ae3 Apr 17 17:38:42.681306 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:42.681273 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" event={"ID":"65d2bccb-9024-496e-af5a-bef4dc4b24bc","Type":"ContainerStarted","Data":"80094f1fb214bb9585d15cc66b404adeb6184d4239e645c7deb375bd8c103d16"} Apr 17 17:38:42.681306 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:42.681308 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" event={"ID":"65d2bccb-9024-496e-af5a-bef4dc4b24bc","Type":"ContainerStarted","Data":"598843be310cb34fcc9637eefc43502f2ea122cf551ec99d778bd5a4952a1ae3"} Apr 17 17:38:44.296301 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:44.296266 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused" Apr 17 17:38:44.303028 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:44.303002 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 17 17:38:45.050868 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.050842 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" Apr 17 17:38:45.158612 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.158527 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db934d6a-6776-4aca-8a20-86bcd0da3864-kserve-provision-location\") pod \"db934d6a-6776-4aca-8a20-86bcd0da3864\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " Apr 17 17:38:45.158612 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.158580 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-c3f82-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db934d6a-6776-4aca-8a20-86bcd0da3864-isvc-xgboost-graph-raw-c3f82-kube-rbac-proxy-sar-config\") pod \"db934d6a-6776-4aca-8a20-86bcd0da3864\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " Apr 17 17:38:45.158875 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.158632 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8dph\" (UniqueName: \"kubernetes.io/projected/db934d6a-6776-4aca-8a20-86bcd0da3864-kube-api-access-l8dph\") pod \"db934d6a-6776-4aca-8a20-86bcd0da3864\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " Apr 17 17:38:45.158875 ip-10-0-137-46 
kubenswrapper[2546]: I0417 17:38:45.158695 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db934d6a-6776-4aca-8a20-86bcd0da3864-proxy-tls\") pod \"db934d6a-6776-4aca-8a20-86bcd0da3864\" (UID: \"db934d6a-6776-4aca-8a20-86bcd0da3864\") " Apr 17 17:38:45.158984 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.158940 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db934d6a-6776-4aca-8a20-86bcd0da3864-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "db934d6a-6776-4aca-8a20-86bcd0da3864" (UID: "db934d6a-6776-4aca-8a20-86bcd0da3864"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:45.159040 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.158993 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db934d6a-6776-4aca-8a20-86bcd0da3864-isvc-xgboost-graph-raw-c3f82-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-c3f82-kube-rbac-proxy-sar-config") pod "db934d6a-6776-4aca-8a20-86bcd0da3864" (UID: "db934d6a-6776-4aca-8a20-86bcd0da3864"). InnerVolumeSpecName "isvc-xgboost-graph-raw-c3f82-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:38:45.160830 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.160808 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db934d6a-6776-4aca-8a20-86bcd0da3864-kube-api-access-l8dph" (OuterVolumeSpecName: "kube-api-access-l8dph") pod "db934d6a-6776-4aca-8a20-86bcd0da3864" (UID: "db934d6a-6776-4aca-8a20-86bcd0da3864"). InnerVolumeSpecName "kube-api-access-l8dph". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:38:45.160922 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.160898 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db934d6a-6776-4aca-8a20-86bcd0da3864-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "db934d6a-6776-4aca-8a20-86bcd0da3864" (UID: "db934d6a-6776-4aca-8a20-86bcd0da3864"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:38:45.259853 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.259816 2546 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db934d6a-6776-4aca-8a20-86bcd0da3864-proxy-tls\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:38:45.259853 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.259849 2546 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db934d6a-6776-4aca-8a20-86bcd0da3864-kserve-provision-location\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:38:45.259853 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.259859 2546 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-c3f82-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db934d6a-6776-4aca-8a20-86bcd0da3864-isvc-xgboost-graph-raw-c3f82-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:38:45.260106 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.259869 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l8dph\" (UniqueName: \"kubernetes.io/projected/db934d6a-6776-4aca-8a20-86bcd0da3864-kube-api-access-l8dph\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:38:45.698104 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.698074 2546 generic.go:358] "Generic (PLEG): container finished" 
podID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerID="80094f1fb214bb9585d15cc66b404adeb6184d4239e645c7deb375bd8c103d16" exitCode=0 Apr 17 17:38:45.698578 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.698153 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" event={"ID":"65d2bccb-9024-496e-af5a-bef4dc4b24bc","Type":"ContainerDied","Data":"80094f1fb214bb9585d15cc66b404adeb6184d4239e645c7deb375bd8c103d16"} Apr 17 17:38:45.699994 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.699968 2546 generic.go:358] "Generic (PLEG): container finished" podID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerID="7fec421778ce1652f456cd62e9c3f3c74f8ee5418abaee6a0b8c38ad31cc49da" exitCode=0 Apr 17 17:38:45.700086 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.700018 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" event={"ID":"db934d6a-6776-4aca-8a20-86bcd0da3864","Type":"ContainerDied","Data":"7fec421778ce1652f456cd62e9c3f3c74f8ee5418abaee6a0b8c38ad31cc49da"} Apr 17 17:38:45.700086 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.700041 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" event={"ID":"db934d6a-6776-4aca-8a20-86bcd0da3864","Type":"ContainerDied","Data":"da00db7181bbbfb0ca8b9017393954c3047f2a2df76e99908fdcff2f58ac7041"} Apr 17 17:38:45.700086 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.700056 2546 scope.go:117] "RemoveContainer" containerID="0e0adba90d819784141ff3944bb9e14102f21f7bb7b9ba403f0352dde9fef01d" Apr 17 17:38:45.700086 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.700055 2546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn" Apr 17 17:38:45.728957 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.728926 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"] Apr 17 17:38:45.734444 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.734419 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-c3f82-predictor-655c9dfdf7-767bn"] Apr 17 17:38:45.748536 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.748516 2546 scope.go:117] "RemoveContainer" containerID="7fec421778ce1652f456cd62e9c3f3c74f8ee5418abaee6a0b8c38ad31cc49da" Apr 17 17:38:45.760824 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.760800 2546 scope.go:117] "RemoveContainer" containerID="061a790a80d075ae6c86b3f477abf59ac8f21528825eace2f3ff585f3d6f9f79" Apr 17 17:38:45.777962 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.777933 2546 scope.go:117] "RemoveContainer" containerID="0e0adba90d819784141ff3944bb9e14102f21f7bb7b9ba403f0352dde9fef01d" Apr 17 17:38:45.778280 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:38:45.778257 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e0adba90d819784141ff3944bb9e14102f21f7bb7b9ba403f0352dde9fef01d\": container with ID starting with 0e0adba90d819784141ff3944bb9e14102f21f7bb7b9ba403f0352dde9fef01d not found: ID does not exist" containerID="0e0adba90d819784141ff3944bb9e14102f21f7bb7b9ba403f0352dde9fef01d" Apr 17 17:38:45.778363 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.778288 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e0adba90d819784141ff3944bb9e14102f21f7bb7b9ba403f0352dde9fef01d"} err="failed to get container status \"0e0adba90d819784141ff3944bb9e14102f21f7bb7b9ba403f0352dde9fef01d\": rpc error: code = NotFound 
desc = could not find container \"0e0adba90d819784141ff3944bb9e14102f21f7bb7b9ba403f0352dde9fef01d\": container with ID starting with 0e0adba90d819784141ff3944bb9e14102f21f7bb7b9ba403f0352dde9fef01d not found: ID does not exist" Apr 17 17:38:45.778363 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.778309 2546 scope.go:117] "RemoveContainer" containerID="7fec421778ce1652f456cd62e9c3f3c74f8ee5418abaee6a0b8c38ad31cc49da" Apr 17 17:38:45.778588 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:38:45.778564 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fec421778ce1652f456cd62e9c3f3c74f8ee5418abaee6a0b8c38ad31cc49da\": container with ID starting with 7fec421778ce1652f456cd62e9c3f3c74f8ee5418abaee6a0b8c38ad31cc49da not found: ID does not exist" containerID="7fec421778ce1652f456cd62e9c3f3c74f8ee5418abaee6a0b8c38ad31cc49da" Apr 17 17:38:45.778691 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.778586 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fec421778ce1652f456cd62e9c3f3c74f8ee5418abaee6a0b8c38ad31cc49da"} err="failed to get container status \"7fec421778ce1652f456cd62e9c3f3c74f8ee5418abaee6a0b8c38ad31cc49da\": rpc error: code = NotFound desc = could not find container \"7fec421778ce1652f456cd62e9c3f3c74f8ee5418abaee6a0b8c38ad31cc49da\": container with ID starting with 7fec421778ce1652f456cd62e9c3f3c74f8ee5418abaee6a0b8c38ad31cc49da not found: ID does not exist" Apr 17 17:38:45.778691 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.778603 2546 scope.go:117] "RemoveContainer" containerID="061a790a80d075ae6c86b3f477abf59ac8f21528825eace2f3ff585f3d6f9f79" Apr 17 17:38:45.778930 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:38:45.778903 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"061a790a80d075ae6c86b3f477abf59ac8f21528825eace2f3ff585f3d6f9f79\": 
container with ID starting with 061a790a80d075ae6c86b3f477abf59ac8f21528825eace2f3ff585f3d6f9f79 not found: ID does not exist" containerID="061a790a80d075ae6c86b3f477abf59ac8f21528825eace2f3ff585f3d6f9f79" Apr 17 17:38:45.779015 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.778937 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"061a790a80d075ae6c86b3f477abf59ac8f21528825eace2f3ff585f3d6f9f79"} err="failed to get container status \"061a790a80d075ae6c86b3f477abf59ac8f21528825eace2f3ff585f3d6f9f79\": rpc error: code = NotFound desc = could not find container \"061a790a80d075ae6c86b3f477abf59ac8f21528825eace2f3ff585f3d6f9f79\": container with ID starting with 061a790a80d075ae6c86b3f477abf59ac8f21528825eace2f3ff585f3d6f9f79 not found: ID does not exist" Apr 17 17:38:45.889759 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.889737 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" Apr 17 17:38:45.954234 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:45.954159 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" path="/var/lib/kubelet/pods/db934d6a-6776-4aca-8a20-86bcd0da3864/volumes" Apr 17 17:38:46.066908 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.066870 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt6m4\" (UniqueName: \"kubernetes.io/projected/f610c89b-d09c-430d-90f0-b490ebd3d466-kube-api-access-xt6m4\") pod \"f610c89b-d09c-430d-90f0-b490ebd3d466\" (UID: \"f610c89b-d09c-430d-90f0-b490ebd3d466\") " Apr 17 17:38:46.067069 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.066949 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f610c89b-d09c-430d-90f0-b490ebd3d466-proxy-tls\") pod 
\"f610c89b-d09c-430d-90f0-b490ebd3d466\" (UID: \"f610c89b-d09c-430d-90f0-b490ebd3d466\") " Apr 17 17:38:46.067069 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.066989 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-c3f82-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f610c89b-d09c-430d-90f0-b490ebd3d466-isvc-sklearn-graph-raw-c3f82-kube-rbac-proxy-sar-config\") pod \"f610c89b-d09c-430d-90f0-b490ebd3d466\" (UID: \"f610c89b-d09c-430d-90f0-b490ebd3d466\") " Apr 17 17:38:46.067069 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.067018 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f610c89b-d09c-430d-90f0-b490ebd3d466-kserve-provision-location\") pod \"f610c89b-d09c-430d-90f0-b490ebd3d466\" (UID: \"f610c89b-d09c-430d-90f0-b490ebd3d466\") " Apr 17 17:38:46.067319 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.067295 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f610c89b-d09c-430d-90f0-b490ebd3d466-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f610c89b-d09c-430d-90f0-b490ebd3d466" (UID: "f610c89b-d09c-430d-90f0-b490ebd3d466"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:46.067393 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.067324 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f610c89b-d09c-430d-90f0-b490ebd3d466-isvc-sklearn-graph-raw-c3f82-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-c3f82-kube-rbac-proxy-sar-config") pod "f610c89b-d09c-430d-90f0-b490ebd3d466" (UID: "f610c89b-d09c-430d-90f0-b490ebd3d466"). InnerVolumeSpecName "isvc-sklearn-graph-raw-c3f82-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:38:46.068982 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.068958 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f610c89b-d09c-430d-90f0-b490ebd3d466-kube-api-access-xt6m4" (OuterVolumeSpecName: "kube-api-access-xt6m4") pod "f610c89b-d09c-430d-90f0-b490ebd3d466" (UID: "f610c89b-d09c-430d-90f0-b490ebd3d466"). InnerVolumeSpecName "kube-api-access-xt6m4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:38:46.069098 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.069066 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f610c89b-d09c-430d-90f0-b490ebd3d466-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f610c89b-d09c-430d-90f0-b490ebd3d466" (UID: "f610c89b-d09c-430d-90f0-b490ebd3d466"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:38:46.167889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.167849 2546 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-c3f82-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f610c89b-d09c-430d-90f0-b490ebd3d466-isvc-sklearn-graph-raw-c3f82-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:38:46.167889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.167881 2546 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f610c89b-d09c-430d-90f0-b490ebd3d466-kserve-provision-location\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:38:46.167889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.167892 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xt6m4\" (UniqueName: \"kubernetes.io/projected/f610c89b-d09c-430d-90f0-b490ebd3d466-kube-api-access-xt6m4\") on node 
\"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:38:46.167889 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.167901 2546 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f610c89b-d09c-430d-90f0-b490ebd3d466-proxy-tls\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:38:46.704742 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.704705 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" event={"ID":"65d2bccb-9024-496e-af5a-bef4dc4b24bc","Type":"ContainerStarted","Data":"40b6f3993b71c2a35142d8cfb495da44d9e370f75d5d352c4d17329d276875f7"} Apr 17 17:38:46.705173 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.704754 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" event={"ID":"65d2bccb-9024-496e-af5a-bef4dc4b24bc","Type":"ContainerStarted","Data":"b99554122160908712cbaa27fceb22540cd8111e42d1d05fb5a3764a86493fc8"} Apr 17 17:38:46.705173 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.705104 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:46.705291 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.705230 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:46.706611 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.706582 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 17 17:38:46.707302 
ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.707284 2546 generic.go:358] "Generic (PLEG): container finished" podID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerID="60ea682aba3619dab891597298194bece0e30f8b7b12cc6829222ec868887301" exitCode=0 Apr 17 17:38:46.707363 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.707349 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" Apr 17 17:38:46.707407 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.707361 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" event={"ID":"f610c89b-d09c-430d-90f0-b490ebd3d466","Type":"ContainerDied","Data":"60ea682aba3619dab891597298194bece0e30f8b7b12cc6829222ec868887301"} Apr 17 17:38:46.707407 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.707389 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp" event={"ID":"f610c89b-d09c-430d-90f0-b490ebd3d466","Type":"ContainerDied","Data":"605a4532fac29efc19d65c3d8996e84d46540a2c030b9023ed0374d2b37abe8b"} Apr 17 17:38:46.707477 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.707407 2546 scope.go:117] "RemoveContainer" containerID="a1fc0c1e0c2e6d83c89d1cbd4def20c1aedc4f409cc225be4bf11994b6972ff6" Apr 17 17:38:46.716870 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.716855 2546 scope.go:117] "RemoveContainer" containerID="60ea682aba3619dab891597298194bece0e30f8b7b12cc6829222ec868887301" Apr 17 17:38:46.724083 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.724043 2546 scope.go:117] "RemoveContainer" containerID="fcdbf757283d3909ebc2afcb6efcbf6434a58e46476c438165e0b1767868ac82" Apr 17 17:38:46.724831 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.724773 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" podStartSLOduration=5.724759919 podStartE2EDuration="5.724759919s" podCreationTimestamp="2026-04-17 17:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:38:46.723772681 +0000 UTC m=+819.346053366" watchObservedRunningTime="2026-04-17 17:38:46.724759919 +0000 UTC m=+819.347040594" Apr 17 17:38:46.731523 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.731507 2546 scope.go:117] "RemoveContainer" containerID="a1fc0c1e0c2e6d83c89d1cbd4def20c1aedc4f409cc225be4bf11994b6972ff6" Apr 17 17:38:46.731774 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:38:46.731754 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1fc0c1e0c2e6d83c89d1cbd4def20c1aedc4f409cc225be4bf11994b6972ff6\": container with ID starting with a1fc0c1e0c2e6d83c89d1cbd4def20c1aedc4f409cc225be4bf11994b6972ff6 not found: ID does not exist" containerID="a1fc0c1e0c2e6d83c89d1cbd4def20c1aedc4f409cc225be4bf11994b6972ff6" Apr 17 17:38:46.731839 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.731786 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1fc0c1e0c2e6d83c89d1cbd4def20c1aedc4f409cc225be4bf11994b6972ff6"} err="failed to get container status \"a1fc0c1e0c2e6d83c89d1cbd4def20c1aedc4f409cc225be4bf11994b6972ff6\": rpc error: code = NotFound desc = could not find container \"a1fc0c1e0c2e6d83c89d1cbd4def20c1aedc4f409cc225be4bf11994b6972ff6\": container with ID starting with a1fc0c1e0c2e6d83c89d1cbd4def20c1aedc4f409cc225be4bf11994b6972ff6 not found: ID does not exist" Apr 17 17:38:46.731839 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.731804 2546 scope.go:117] "RemoveContainer" containerID="60ea682aba3619dab891597298194bece0e30f8b7b12cc6829222ec868887301" Apr 17 17:38:46.732045 ip-10-0-137-46 
kubenswrapper[2546]: E0417 17:38:46.732027 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60ea682aba3619dab891597298194bece0e30f8b7b12cc6829222ec868887301\": container with ID starting with 60ea682aba3619dab891597298194bece0e30f8b7b12cc6829222ec868887301 not found: ID does not exist" containerID="60ea682aba3619dab891597298194bece0e30f8b7b12cc6829222ec868887301" Apr 17 17:38:46.732087 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.732051 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60ea682aba3619dab891597298194bece0e30f8b7b12cc6829222ec868887301"} err="failed to get container status \"60ea682aba3619dab891597298194bece0e30f8b7b12cc6829222ec868887301\": rpc error: code = NotFound desc = could not find container \"60ea682aba3619dab891597298194bece0e30f8b7b12cc6829222ec868887301\": container with ID starting with 60ea682aba3619dab891597298194bece0e30f8b7b12cc6829222ec868887301 not found: ID does not exist" Apr 17 17:38:46.732087 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.732067 2546 scope.go:117] "RemoveContainer" containerID="fcdbf757283d3909ebc2afcb6efcbf6434a58e46476c438165e0b1767868ac82" Apr 17 17:38:46.732296 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:38:46.732282 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcdbf757283d3909ebc2afcb6efcbf6434a58e46476c438165e0b1767868ac82\": container with ID starting with fcdbf757283d3909ebc2afcb6efcbf6434a58e46476c438165e0b1767868ac82 not found: ID does not exist" containerID="fcdbf757283d3909ebc2afcb6efcbf6434a58e46476c438165e0b1767868ac82" Apr 17 17:38:46.732336 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.732301 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcdbf757283d3909ebc2afcb6efcbf6434a58e46476c438165e0b1767868ac82"} err="failed to 
get container status \"fcdbf757283d3909ebc2afcb6efcbf6434a58e46476c438165e0b1767868ac82\": rpc error: code = NotFound desc = could not find container \"fcdbf757283d3909ebc2afcb6efcbf6434a58e46476c438165e0b1767868ac82\": container with ID starting with fcdbf757283d3909ebc2afcb6efcbf6434a58e46476c438165e0b1767868ac82 not found: ID does not exist" Apr 17 17:38:46.737971 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.737946 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"] Apr 17 17:38:46.741884 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:46.741866 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-c3f82-predictor-85bb8d8fc-vq5lp"] Apr 17 17:38:47.711795 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:47.711758 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 17 17:38:47.954009 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:47.953975 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" path="/var/lib/kubelet/pods/f610c89b-d09c-430d-90f0-b490ebd3d466/volumes" Apr 17 17:38:52.716324 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:52.716295 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:38:52.716908 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:38:52.716877 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 17 17:39:02.716834 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:39:02.716794 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 17 17:39:12.716925 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:39:12.716880 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 17 17:39:22.717207 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:39:22.717164 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 17 17:39:32.717798 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:39:32.717757 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 17 17:39:42.717728 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:39:42.717669 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: 
connect: connection refused" Apr 17 17:39:52.717875 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:39:52.717804 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" Apr 17 17:40:07.874094 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:07.874058 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/ovn-acl-logging/0.log" Apr 17 17:40:07.876077 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:07.876054 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/ovn-acl-logging/0.log" Apr 17 17:40:21.609085 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.609045 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl"] Apr 17 17:40:21.609896 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.609386 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kserve-container" containerID="cri-o://b99554122160908712cbaa27fceb22540cd8111e42d1d05fb5a3764a86493fc8" gracePeriod=30 Apr 17 17:40:21.609896 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.609445 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kube-rbac-proxy" containerID="cri-o://40b6f3993b71c2a35142d8cfb495da44d9e370f75d5d352c4d17329d276875f7" gracePeriod=30 Apr 17 17:40:21.651081 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651048 2546 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"] Apr 17 17:40:21.651417 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651405 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="storage-initializer" Apr 17 17:40:21.651459 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651418 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="storage-initializer" Apr 17 17:40:21.651459 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651428 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kserve-container" Apr 17 17:40:21.651459 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651434 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kserve-container" Apr 17 17:40:21.651459 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651442 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="storage-initializer" Apr 17 17:40:21.651459 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651448 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="storage-initializer" Apr 17 17:40:21.651690 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651471 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="kserve-container" Apr 17 17:40:21.651690 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651477 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="kserve-container" Apr 17 17:40:21.651690 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651484 2546 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kube-rbac-proxy" Apr 17 17:40:21.651690 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651489 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kube-rbac-proxy" Apr 17 17:40:21.651690 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651496 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="kube-rbac-proxy" Apr 17 17:40:21.651690 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651502 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="kube-rbac-proxy" Apr 17 17:40:21.651690 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651551 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kube-rbac-proxy" Apr 17 17:40:21.651690 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651559 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="f610c89b-d09c-430d-90f0-b490ebd3d466" containerName="kserve-container" Apr 17 17:40:21.651690 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651566 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="kserve-container" Apr 17 17:40:21.651690 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.651573 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="db934d6a-6776-4aca-8a20-86bcd0da3864" containerName="kube-rbac-proxy" Apr 17 17:40:21.654895 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.654876 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5" Apr 17 17:40:21.657648 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.657582 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-e5a50-predictor-serving-cert\"" Apr 17 17:40:21.657648 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.657589 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-e5a50-kube-rbac-proxy-sar-config\"" Apr 17 17:40:21.666249 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.665106 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"] Apr 17 17:40:21.798550 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.798515 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-raw-e5a50-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7b47acb-62e5-41a1-9006-06239577eae4-message-dumper-raw-e5a50-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-e5a50-predictor-85d649894b-q44k5\" (UID: \"e7b47acb-62e5-41a1-9006-06239577eae4\") " pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5" Apr 17 17:40:21.798752 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.798570 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7b47acb-62e5-41a1-9006-06239577eae4-proxy-tls\") pod \"message-dumper-raw-e5a50-predictor-85d649894b-q44k5\" (UID: \"e7b47acb-62e5-41a1-9006-06239577eae4\") " pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5" Apr 17 17:40:21.798799 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.798758 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mzjqj\" (UniqueName: \"kubernetes.io/projected/e7b47acb-62e5-41a1-9006-06239577eae4-kube-api-access-mzjqj\") pod \"message-dumper-raw-e5a50-predictor-85d649894b-q44k5\" (UID: \"e7b47acb-62e5-41a1-9006-06239577eae4\") " pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"
Apr 17 17:40:21.899964 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.899872 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzjqj\" (UniqueName: \"kubernetes.io/projected/e7b47acb-62e5-41a1-9006-06239577eae4-kube-api-access-mzjqj\") pod \"message-dumper-raw-e5a50-predictor-85d649894b-q44k5\" (UID: \"e7b47acb-62e5-41a1-9006-06239577eae4\") " pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"
Apr 17 17:40:21.899964 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.899925 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-raw-e5a50-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7b47acb-62e5-41a1-9006-06239577eae4-message-dumper-raw-e5a50-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-e5a50-predictor-85d649894b-q44k5\" (UID: \"e7b47acb-62e5-41a1-9006-06239577eae4\") " pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"
Apr 17 17:40:21.900170 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.899980 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7b47acb-62e5-41a1-9006-06239577eae4-proxy-tls\") pod \"message-dumper-raw-e5a50-predictor-85d649894b-q44k5\" (UID: \"e7b47acb-62e5-41a1-9006-06239577eae4\") " pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"
Apr 17 17:40:21.900170 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:40:21.900130 2546 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-serving-cert: secret "message-dumper-raw-e5a50-predictor-serving-cert" not found
Apr 17 17:40:21.900252 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:40:21.900197 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7b47acb-62e5-41a1-9006-06239577eae4-proxy-tls podName:e7b47acb-62e5-41a1-9006-06239577eae4 nodeName:}" failed. No retries permitted until 2026-04-17 17:40:22.400175636 +0000 UTC m=+915.022456303 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e7b47acb-62e5-41a1-9006-06239577eae4-proxy-tls") pod "message-dumper-raw-e5a50-predictor-85d649894b-q44k5" (UID: "e7b47acb-62e5-41a1-9006-06239577eae4") : secret "message-dumper-raw-e5a50-predictor-serving-cert" not found
Apr 17 17:40:21.900535 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.900517 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-raw-e5a50-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7b47acb-62e5-41a1-9006-06239577eae4-message-dumper-raw-e5a50-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-e5a50-predictor-85d649894b-q44k5\" (UID: \"e7b47acb-62e5-41a1-9006-06239577eae4\") " pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"
Apr 17 17:40:21.910298 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:21.910267 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzjqj\" (UniqueName: \"kubernetes.io/projected/e7b47acb-62e5-41a1-9006-06239577eae4-kube-api-access-mzjqj\") pod \"message-dumper-raw-e5a50-predictor-85d649894b-q44k5\" (UID: \"e7b47acb-62e5-41a1-9006-06239577eae4\") " pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"
Apr 17 17:40:22.034033 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:22.033992 2546 generic.go:358] "Generic (PLEG): container finished" podID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerID="40b6f3993b71c2a35142d8cfb495da44d9e370f75d5d352c4d17329d276875f7" exitCode=2
Apr 17 17:40:22.034196 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:22.034063 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" event={"ID":"65d2bccb-9024-496e-af5a-bef4dc4b24bc","Type":"ContainerDied","Data":"40b6f3993b71c2a35142d8cfb495da44d9e370f75d5d352c4d17329d276875f7"}
Apr 17 17:40:22.403614 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:22.403566 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7b47acb-62e5-41a1-9006-06239577eae4-proxy-tls\") pod \"message-dumper-raw-e5a50-predictor-85d649894b-q44k5\" (UID: \"e7b47acb-62e5-41a1-9006-06239577eae4\") " pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"
Apr 17 17:40:22.406083 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:22.406059 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7b47acb-62e5-41a1-9006-06239577eae4-proxy-tls\") pod \"message-dumper-raw-e5a50-predictor-85d649894b-q44k5\" (UID: \"e7b47acb-62e5-41a1-9006-06239577eae4\") " pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"
Apr 17 17:40:22.568118 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:22.568077 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"
Apr 17 17:40:22.694195 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:22.694170 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"]
Apr 17 17:40:22.696622 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:40:22.696588 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7b47acb_62e5_41a1_9006_06239577eae4.slice/crio-ffc55cbaeaaaf7a66301695dd7244219be4255a91ae48c4d3fb027b1c7382de2 WatchSource:0}: Error finding container ffc55cbaeaaaf7a66301695dd7244219be4255a91ae48c4d3fb027b1c7382de2: Status 404 returned error can't find the container with id ffc55cbaeaaaf7a66301695dd7244219be4255a91ae48c4d3fb027b1c7382de2
Apr 17 17:40:22.712692 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:22.712647 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.36:8643/healthz\": dial tcp 10.134.0.36:8643: connect: connection refused"
Apr 17 17:40:22.716949 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:22.716926 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 17 17:40:23.037632 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:23.037597 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5" event={"ID":"e7b47acb-62e5-41a1-9006-06239577eae4","Type":"ContainerStarted","Data":"ffc55cbaeaaaf7a66301695dd7244219be4255a91ae48c4d3fb027b1c7382de2"}
Apr 17 17:40:24.042757 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:24.042642 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5" event={"ID":"e7b47acb-62e5-41a1-9006-06239577eae4","Type":"ContainerStarted","Data":"fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55"}
Apr 17 17:40:24.042757 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:24.042709 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5" event={"ID":"e7b47acb-62e5-41a1-9006-06239577eae4","Type":"ContainerStarted","Data":"99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133"}
Apr 17 17:40:24.043248 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:24.042818 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"
Apr 17 17:40:24.061789 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:24.061725 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5" podStartSLOduration=1.9725128330000001 podStartE2EDuration="3.061703403s" podCreationTimestamp="2026-04-17 17:40:21 +0000 UTC" firstStartedPulling="2026-04-17 17:40:22.698387936 +0000 UTC m=+915.320668589" lastFinishedPulling="2026-04-17 17:40:23.787578493 +0000 UTC m=+916.409859159" observedRunningTime="2026-04-17 17:40:24.060351563 +0000 UTC m=+916.682632238" watchObservedRunningTime="2026-04-17 17:40:24.061703403 +0000 UTC m=+916.683984072"
Apr 17 17:40:25.047120 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:25.047091 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"
Apr 17 17:40:25.049079 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:25.049056 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"
Apr 17 17:40:25.855421 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:25.855391 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl"
Apr 17 17:40:26.037674 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.037647 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65d2bccb-9024-496e-af5a-bef4dc4b24bc-kserve-provision-location\") pod \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\" (UID: \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\") "
Apr 17 17:40:26.037840 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.037700 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57sps\" (UniqueName: \"kubernetes.io/projected/65d2bccb-9024-496e-af5a-bef4dc4b24bc-kube-api-access-57sps\") pod \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\" (UID: \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\") "
Apr 17 17:40:26.037840 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.037731 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/65d2bccb-9024-496e-af5a-bef4dc4b24bc-isvc-sklearn-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\") pod \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\" (UID: \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\") "
Apr 17 17:40:26.037840 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.037780 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65d2bccb-9024-496e-af5a-bef4dc4b24bc-proxy-tls\") pod \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\" (UID: \"65d2bccb-9024-496e-af5a-bef4dc4b24bc\") "
Apr 17 17:40:26.038057 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.038024 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65d2bccb-9024-496e-af5a-bef4dc4b24bc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "65d2bccb-9024-496e-af5a-bef4dc4b24bc" (UID: "65d2bccb-9024-496e-af5a-bef4dc4b24bc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:40:26.038114 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.038089 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65d2bccb-9024-496e-af5a-bef4dc4b24bc-isvc-sklearn-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config") pod "65d2bccb-9024-496e-af5a-bef4dc4b24bc" (UID: "65d2bccb-9024-496e-af5a-bef4dc4b24bc"). InnerVolumeSpecName "isvc-sklearn-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:40:26.039833 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.039809 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d2bccb-9024-496e-af5a-bef4dc4b24bc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "65d2bccb-9024-496e-af5a-bef4dc4b24bc" (UID: "65d2bccb-9024-496e-af5a-bef4dc4b24bc"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:40:26.039833 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.039820 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65d2bccb-9024-496e-af5a-bef4dc4b24bc-kube-api-access-57sps" (OuterVolumeSpecName: "kube-api-access-57sps") pod "65d2bccb-9024-496e-af5a-bef4dc4b24bc" (UID: "65d2bccb-9024-496e-af5a-bef4dc4b24bc"). InnerVolumeSpecName "kube-api-access-57sps". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:40:26.052428 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.052406 2546 generic.go:358] "Generic (PLEG): container finished" podID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerID="b99554122160908712cbaa27fceb22540cd8111e42d1d05fb5a3764a86493fc8" exitCode=0
Apr 17 17:40:26.052737 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.052493 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl"
Apr 17 17:40:26.052737 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.052492 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" event={"ID":"65d2bccb-9024-496e-af5a-bef4dc4b24bc","Type":"ContainerDied","Data":"b99554122160908712cbaa27fceb22540cd8111e42d1d05fb5a3764a86493fc8"}
Apr 17 17:40:26.052737 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.052535 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl" event={"ID":"65d2bccb-9024-496e-af5a-bef4dc4b24bc","Type":"ContainerDied","Data":"598843be310cb34fcc9637eefc43502f2ea122cf551ec99d778bd5a4952a1ae3"}
Apr 17 17:40:26.052737 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.052555 2546 scope.go:117] "RemoveContainer" containerID="40b6f3993b71c2a35142d8cfb495da44d9e370f75d5d352c4d17329d276875f7"
Apr 17 17:40:26.060592 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.060576 2546 scope.go:117] "RemoveContainer" containerID="b99554122160908712cbaa27fceb22540cd8111e42d1d05fb5a3764a86493fc8"
Apr 17 17:40:26.067656 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.067635 2546 scope.go:117] "RemoveContainer" containerID="80094f1fb214bb9585d15cc66b404adeb6184d4239e645c7deb375bd8c103d16"
Apr 17 17:40:26.072968 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.072945 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl"]
Apr 17 17:40:26.075536 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.075514 2546 scope.go:117] "RemoveContainer" containerID="40b6f3993b71c2a35142d8cfb495da44d9e370f75d5d352c4d17329d276875f7"
Apr 17 17:40:26.075947 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:40:26.075921 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40b6f3993b71c2a35142d8cfb495da44d9e370f75d5d352c4d17329d276875f7\": container with ID starting with 40b6f3993b71c2a35142d8cfb495da44d9e370f75d5d352c4d17329d276875f7 not found: ID does not exist" containerID="40b6f3993b71c2a35142d8cfb495da44d9e370f75d5d352c4d17329d276875f7"
Apr 17 17:40:26.076022 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.075956 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40b6f3993b71c2a35142d8cfb495da44d9e370f75d5d352c4d17329d276875f7"} err="failed to get container status \"40b6f3993b71c2a35142d8cfb495da44d9e370f75d5d352c4d17329d276875f7\": rpc error: code = NotFound desc = could not find container \"40b6f3993b71c2a35142d8cfb495da44d9e370f75d5d352c4d17329d276875f7\": container with ID starting with 40b6f3993b71c2a35142d8cfb495da44d9e370f75d5d352c4d17329d276875f7 not found: ID does not exist"
Apr 17 17:40:26.076022 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.075980 2546 scope.go:117] "RemoveContainer" containerID="b99554122160908712cbaa27fceb22540cd8111e42d1d05fb5a3764a86493fc8"
Apr 17 17:40:26.076254 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:40:26.076232 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b99554122160908712cbaa27fceb22540cd8111e42d1d05fb5a3764a86493fc8\": container with ID starting with b99554122160908712cbaa27fceb22540cd8111e42d1d05fb5a3764a86493fc8 not found: ID does not exist" containerID="b99554122160908712cbaa27fceb22540cd8111e42d1d05fb5a3764a86493fc8"
Apr 17 17:40:26.076303 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.076270 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b99554122160908712cbaa27fceb22540cd8111e42d1d05fb5a3764a86493fc8"} err="failed to get container status \"b99554122160908712cbaa27fceb22540cd8111e42d1d05fb5a3764a86493fc8\": rpc error: code = NotFound desc = could not find container \"b99554122160908712cbaa27fceb22540cd8111e42d1d05fb5a3764a86493fc8\": container with ID starting with b99554122160908712cbaa27fceb22540cd8111e42d1d05fb5a3764a86493fc8 not found: ID does not exist"
Apr 17 17:40:26.076303 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.076288 2546 scope.go:117] "RemoveContainer" containerID="80094f1fb214bb9585d15cc66b404adeb6184d4239e645c7deb375bd8c103d16"
Apr 17 17:40:26.076545 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:40:26.076528 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80094f1fb214bb9585d15cc66b404adeb6184d4239e645c7deb375bd8c103d16\": container with ID starting with 80094f1fb214bb9585d15cc66b404adeb6184d4239e645c7deb375bd8c103d16 not found: ID does not exist" containerID="80094f1fb214bb9585d15cc66b404adeb6184d4239e645c7deb375bd8c103d16"
Apr 17 17:40:26.076602 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.076550 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80094f1fb214bb9585d15cc66b404adeb6184d4239e645c7deb375bd8c103d16"} err="failed to get container status \"80094f1fb214bb9585d15cc66b404adeb6184d4239e645c7deb375bd8c103d16\": rpc error: code = NotFound desc = could not find container \"80094f1fb214bb9585d15cc66b404adeb6184d4239e645c7deb375bd8c103d16\": container with ID starting with 80094f1fb214bb9585d15cc66b404adeb6184d4239e645c7deb375bd8c103d16 not found: ID does not exist"
Apr 17 17:40:26.076944 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.076929 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4dd55-predictor-578bc84689-j5vhl"]
Apr 17 17:40:26.138956 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.138931 2546 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65d2bccb-9024-496e-af5a-bef4dc4b24bc-proxy-tls\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:40:26.138956 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.138955 2546 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65d2bccb-9024-496e-af5a-bef4dc4b24bc-kserve-provision-location\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:40:26.139130 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.138966 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-57sps\" (UniqueName: \"kubernetes.io/projected/65d2bccb-9024-496e-af5a-bef4dc4b24bc-kube-api-access-57sps\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:40:26.139130 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:26.138978 2546 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/65d2bccb-9024-496e-af5a-bef4dc4b24bc-isvc-sklearn-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:40:27.953836 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:27.953806 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" path="/var/lib/kubelet/pods/65d2bccb-9024-496e-af5a-bef4dc4b24bc/volumes"
Apr 17 17:40:32.060498 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:32.060472 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"
Apr 17 17:40:41.699582 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.699547 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"]
Apr 17 17:40:41.702024 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.700024 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kserve-container"
Apr 17 17:40:41.702024 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.700038 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kserve-container"
Apr 17 17:40:41.702024 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.700048 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kube-rbac-proxy"
Apr 17 17:40:41.702024 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.700075 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kube-rbac-proxy"
Apr 17 17:40:41.702024 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.700090 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="storage-initializer"
Apr 17 17:40:41.702024 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.700098 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="storage-initializer"
Apr 17 17:40:41.702024 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.700155 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kube-rbac-proxy"
Apr 17 17:40:41.702024 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.700172 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="65d2bccb-9024-496e-af5a-bef4dc4b24bc" containerName="kserve-container"
Apr 17 17:40:41.703223 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.703204 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:41.705713 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.705669 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-e5a50-predictor-serving-cert\""
Apr 17 17:40:41.705841 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.705780 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-e5a50-kube-rbac-proxy-sar-config\""
Apr 17 17:40:41.715317 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.715294 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"]
Apr 17 17:40:41.768001 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.767974 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed3a4414-d204-47e5-93a3-c2b560b48283-proxy-tls\") pod \"isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") " pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:41.768001 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.768004 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed3a4414-d204-47e5-93a3-c2b560b48283-kserve-provision-location\") pod \"isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") " pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:41.768199 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.768029 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-raw-e5a50-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed3a4414-d204-47e5-93a3-c2b560b48283-isvc-logger-raw-e5a50-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") " pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:41.768199 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.768125 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmjjz\" (UniqueName: \"kubernetes.io/projected/ed3a4414-d204-47e5-93a3-c2b560b48283-kube-api-access-hmjjz\") pod \"isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") " pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:41.868903 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.868868 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed3a4414-d204-47e5-93a3-c2b560b48283-proxy-tls\") pod \"isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") " pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:41.868903 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.868904 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed3a4414-d204-47e5-93a3-c2b560b48283-kserve-provision-location\") pod \"isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") " pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:41.869109 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.868928 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-raw-e5a50-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed3a4414-d204-47e5-93a3-c2b560b48283-isvc-logger-raw-e5a50-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") " pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:41.869109 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.868964 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmjjz\" (UniqueName: \"kubernetes.io/projected/ed3a4414-d204-47e5-93a3-c2b560b48283-kube-api-access-hmjjz\") pod \"isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") " pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:41.869109 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:40:41.869039 2546 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-serving-cert: secret "isvc-logger-raw-e5a50-predictor-serving-cert" not found
Apr 17 17:40:41.869231 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:40:41.869113 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed3a4414-d204-47e5-93a3-c2b560b48283-proxy-tls podName:ed3a4414-d204-47e5-93a3-c2b560b48283 nodeName:}" failed. No retries permitted until 2026-04-17 17:40:42.369089734 +0000 UTC m=+934.991370407 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ed3a4414-d204-47e5-93a3-c2b560b48283-proxy-tls") pod "isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" (UID: "ed3a4414-d204-47e5-93a3-c2b560b48283") : secret "isvc-logger-raw-e5a50-predictor-serving-cert" not found
Apr 17 17:40:41.869340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.869323 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed3a4414-d204-47e5-93a3-c2b560b48283-kserve-provision-location\") pod \"isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") " pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:41.869574 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.869555 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-raw-e5a50-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed3a4414-d204-47e5-93a3-c2b560b48283-isvc-logger-raw-e5a50-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") " pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:41.877518 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:41.877490 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmjjz\" (UniqueName: \"kubernetes.io/projected/ed3a4414-d204-47e5-93a3-c2b560b48283-kube-api-access-hmjjz\") pod \"isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") " pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:42.373727 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:42.373660 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed3a4414-d204-47e5-93a3-c2b560b48283-proxy-tls\") pod \"isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") " pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:42.376081 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:42.376054 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed3a4414-d204-47e5-93a3-c2b560b48283-proxy-tls\") pod \"isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") " pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:42.614269 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:42.614232 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:42.746073 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:42.746035 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"]
Apr 17 17:40:42.748692 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:40:42.748647 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded3a4414_d204_47e5_93a3_c2b560b48283.slice/crio-1011e7cf3d2e2fed75912afaced2a449697d32135112fd06c51276c398913370 WatchSource:0}: Error finding container 1011e7cf3d2e2fed75912afaced2a449697d32135112fd06c51276c398913370: Status 404 returned error can't find the container with id 1011e7cf3d2e2fed75912afaced2a449697d32135112fd06c51276c398913370
Apr 17 17:40:42.750830 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:42.750811 2546 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:40:43.109537 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:43.109499 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" event={"ID":"ed3a4414-d204-47e5-93a3-c2b560b48283","Type":"ContainerStarted","Data":"f2f9b0ebd753233c38f6c8459783dc06b31634592a8b2b8be505fdc1978c8e2c"}
Apr 17 17:40:43.109537 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:43.109536 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" event={"ID":"ed3a4414-d204-47e5-93a3-c2b560b48283","Type":"ContainerStarted","Data":"1011e7cf3d2e2fed75912afaced2a449697d32135112fd06c51276c398913370"}
Apr 17 17:40:47.122960 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:47.122922 2546 generic.go:358] "Generic (PLEG): container finished" podID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerID="f2f9b0ebd753233c38f6c8459783dc06b31634592a8b2b8be505fdc1978c8e2c" exitCode=0
Apr 17 17:40:47.123334 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:47.122995 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" event={"ID":"ed3a4414-d204-47e5-93a3-c2b560b48283","Type":"ContainerDied","Data":"f2f9b0ebd753233c38f6c8459783dc06b31634592a8b2b8be505fdc1978c8e2c"}
Apr 17 17:40:48.127807 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:48.127768 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" event={"ID":"ed3a4414-d204-47e5-93a3-c2b560b48283","Type":"ContainerStarted","Data":"63dd93751a316ea1d458ebfc813f9515724d0087e97e97766aa5a9af63b1a442"}
Apr 17 17:40:48.127807 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:48.127810 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" event={"ID":"ed3a4414-d204-47e5-93a3-c2b560b48283","Type":"ContainerStarted","Data":"e2719c614403ea408912588a3fbb417329ef0f79aed402ed00663d5f03d9c0df"}
Apr 17 17:40:48.128221 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:48.127822 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" event={"ID":"ed3a4414-d204-47e5-93a3-c2b560b48283","Type":"ContainerStarted","Data":"559efb716d7eaa5ef5d03b5820d75d87bed44fa1fae269a5c6a3a8874913e9ae"}
Apr 17 17:40:48.128221 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:48.128117 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:48.128318 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:48.128256 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:48.129447 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:48.129421 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 17 17:40:48.148798 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:48.148754 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podStartSLOduration=7.148742511 podStartE2EDuration="7.148742511s" podCreationTimestamp="2026-04-17 17:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:40:48.14781493 +0000 UTC m=+940.770095606" watchObservedRunningTime="2026-04-17 17:40:48.148742511 +0000 UTC m=+940.771023186"
Apr 17 17:40:49.131804 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:49.131754 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:49.132305 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:49.131811 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 17 17:40:49.133042 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:49.133016 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:40:50.140527 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:50.140480 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 17 17:40:50.140970 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:50.140939 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:40:55.144456 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:55.144430 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:40:55.144967 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:55.144937 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 17 17:40:55.145328 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:40:55.145303 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:41:05.145009 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:41:05.144948 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 17 17:41:05.145597 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:41:05.145437 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:41:15.145711 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:41:15.145650 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 17 17:41:15.146221 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:41:15.146193 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:41:25.145327 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:41:25.145283 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 17 17:41:25.145839 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:41:25.145813 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:41:35.144930 ip-10-0-137-46 kubenswrapper[2546]: I0417
17:41:35.144882 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 17 17:41:35.145401 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:41:35.145376 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:41:45.145768 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:41:45.145711 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 17 17:41:45.146375 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:41:45.146073 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:41:55.146351 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:41:55.146317 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" Apr 17 17:41:55.146856 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:41:55.146520 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" Apr 17 17:42:06.731330 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:06.731297 2546 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-e5a50-predictor-85d649894b-q44k5_e7b47acb-62e5-41a1-9006-06239577eae4/kserve-container/0.log" Apr 17 17:42:06.924587 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:06.924545 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"] Apr 17 17:42:06.925106 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:06.925048 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kserve-container" containerID="cri-o://559efb716d7eaa5ef5d03b5820d75d87bed44fa1fae269a5c6a3a8874913e9ae" gracePeriod=30 Apr 17 17:42:06.925667 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:06.925445 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="agent" containerID="cri-o://63dd93751a316ea1d458ebfc813f9515724d0087e97e97766aa5a9af63b1a442" gracePeriod=30 Apr 17 17:42:06.925667 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:06.925497 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kube-rbac-proxy" containerID="cri-o://e2719c614403ea408912588a3fbb417329ef0f79aed402ed00663d5f03d9c0df" gracePeriod=30 Apr 17 17:42:07.005834 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.005757 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh"] Apr 17 17:42:07.009672 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.009651 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" Apr 17 17:42:07.012347 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.012327 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-2162a-predictor-serving-cert\"" Apr 17 17:42:07.012479 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.012349 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-2162a-kube-rbac-proxy-sar-config\"" Apr 17 17:42:07.023595 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.023566 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh"] Apr 17 17:42:07.031791 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.031755 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"] Apr 17 17:42:07.032071 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.032048 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5" podUID="e7b47acb-62e5-41a1-9006-06239577eae4" containerName="kserve-container" containerID="cri-o://99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133" gracePeriod=30 Apr 17 17:42:07.032196 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.032153 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5" podUID="e7b47acb-62e5-41a1-9006-06239577eae4" containerName="kube-rbac-proxy" containerID="cri-o://fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55" gracePeriod=30 Apr 17 17:42:07.034711 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.034567 2546 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-proxy-tls\") pod \"isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh\" (UID: \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" Apr 17 17:42:07.034711 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.034609 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh\" (UID: \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" Apr 17 17:42:07.035230 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.034956 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk748\" (UniqueName: \"kubernetes.io/projected/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-kube-api-access-dk748\") pod \"isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh\" (UID: \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" Apr 17 17:42:07.035501 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.035399 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-scale-raw-2162a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-isvc-sklearn-scale-raw-2162a-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh\" (UID: \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" Apr 17 17:42:07.057831 ip-10-0-137-46 
kubenswrapper[2546]: I0417 17:42:07.057784 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5" podUID="e7b47acb-62e5-41a1-9006-06239577eae4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 17 17:42:07.136157 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.136126 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh\" (UID: \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" Apr 17 17:42:07.136335 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.136250 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk748\" (UniqueName: \"kubernetes.io/projected/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-kube-api-access-dk748\") pod \"isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh\" (UID: \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" Apr 17 17:42:07.136335 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.136313 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-scale-raw-2162a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-isvc-sklearn-scale-raw-2162a-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh\" (UID: \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" Apr 17 17:42:07.136462 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.136347 2546 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-proxy-tls\") pod \"isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh\" (UID: \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" Apr 17 17:42:07.136543 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.136506 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh\" (UID: \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" Apr 17 17:42:07.137022 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.136998 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-scale-raw-2162a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-isvc-sklearn-scale-raw-2162a-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh\" (UID: \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" Apr 17 17:42:07.138927 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.138901 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-proxy-tls\") pod \"isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh\" (UID: \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" Apr 17 17:42:07.145408 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.145385 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-dk748\" (UniqueName: \"kubernetes.io/projected/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-kube-api-access-dk748\") pod \"isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh\" (UID: \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" Apr 17 17:42:07.266822 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.266800 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5" Apr 17 17:42:07.321005 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.320962 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" Apr 17 17:42:07.337953 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.337925 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7b47acb-62e5-41a1-9006-06239577eae4-proxy-tls\") pod \"e7b47acb-62e5-41a1-9006-06239577eae4\" (UID: \"e7b47acb-62e5-41a1-9006-06239577eae4\") " Apr 17 17:42:07.338079 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.337988 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzjqj\" (UniqueName: \"kubernetes.io/projected/e7b47acb-62e5-41a1-9006-06239577eae4-kube-api-access-mzjqj\") pod \"e7b47acb-62e5-41a1-9006-06239577eae4\" (UID: \"e7b47acb-62e5-41a1-9006-06239577eae4\") " Apr 17 17:42:07.338079 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.338047 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-raw-e5a50-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7b47acb-62e5-41a1-9006-06239577eae4-message-dumper-raw-e5a50-kube-rbac-proxy-sar-config\") pod \"e7b47acb-62e5-41a1-9006-06239577eae4\" (UID: \"e7b47acb-62e5-41a1-9006-06239577eae4\") " 
Apr 17 17:42:07.338472 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.338439 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b47acb-62e5-41a1-9006-06239577eae4-message-dumper-raw-e5a50-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-raw-e5a50-kube-rbac-proxy-sar-config") pod "e7b47acb-62e5-41a1-9006-06239577eae4" (UID: "e7b47acb-62e5-41a1-9006-06239577eae4"). InnerVolumeSpecName "message-dumper-raw-e5a50-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:42:07.340084 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.340056 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b47acb-62e5-41a1-9006-06239577eae4-kube-api-access-mzjqj" (OuterVolumeSpecName: "kube-api-access-mzjqj") pod "e7b47acb-62e5-41a1-9006-06239577eae4" (UID: "e7b47acb-62e5-41a1-9006-06239577eae4"). InnerVolumeSpecName "kube-api-access-mzjqj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:42:07.340565 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.340540 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b47acb-62e5-41a1-9006-06239577eae4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e7b47acb-62e5-41a1-9006-06239577eae4" (UID: "e7b47acb-62e5-41a1-9006-06239577eae4"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:42:07.403317 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.403283 2546 generic.go:358] "Generic (PLEG): container finished" podID="e7b47acb-62e5-41a1-9006-06239577eae4" containerID="fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55" exitCode=2 Apr 17 17:42:07.403317 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.403313 2546 generic.go:358] "Generic (PLEG): container finished" podID="e7b47acb-62e5-41a1-9006-06239577eae4" containerID="99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133" exitCode=2 Apr 17 17:42:07.403538 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.403361 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5" event={"ID":"e7b47acb-62e5-41a1-9006-06239577eae4","Type":"ContainerDied","Data":"fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55"} Apr 17 17:42:07.403538 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.403381 2546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5" Apr 17 17:42:07.403538 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.403402 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5" event={"ID":"e7b47acb-62e5-41a1-9006-06239577eae4","Type":"ContainerDied","Data":"99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133"} Apr 17 17:42:07.403538 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.403418 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5" event={"ID":"e7b47acb-62e5-41a1-9006-06239577eae4","Type":"ContainerDied","Data":"ffc55cbaeaaaf7a66301695dd7244219be4255a91ae48c4d3fb027b1c7382de2"} Apr 17 17:42:07.403538 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.403468 2546 scope.go:117] "RemoveContainer" containerID="fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55" Apr 17 17:42:07.406379 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.406354 2546 generic.go:358] "Generic (PLEG): container finished" podID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerID="e2719c614403ea408912588a3fbb417329ef0f79aed402ed00663d5f03d9c0df" exitCode=2 Apr 17 17:42:07.406486 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.406428 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" event={"ID":"ed3a4414-d204-47e5-93a3-c2b560b48283","Type":"ContainerDied","Data":"e2719c614403ea408912588a3fbb417329ef0f79aed402ed00663d5f03d9c0df"} Apr 17 17:42:07.412738 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.412716 2546 scope.go:117] "RemoveContainer" containerID="99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133" Apr 17 17:42:07.421328 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.421307 2546 scope.go:117] "RemoveContainer" 
containerID="fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55" Apr 17 17:42:07.421643 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:42:07.421617 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55\": container with ID starting with fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55 not found: ID does not exist" containerID="fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55" Apr 17 17:42:07.421755 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.421655 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55"} err="failed to get container status \"fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55\": rpc error: code = NotFound desc = could not find container \"fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55\": container with ID starting with fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55 not found: ID does not exist" Apr 17 17:42:07.421755 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.421696 2546 scope.go:117] "RemoveContainer" containerID="99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133" Apr 17 17:42:07.421927 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:42:07.421905 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133\": container with ID starting with 99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133 not found: ID does not exist" containerID="99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133" Apr 17 17:42:07.421991 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.421932 2546 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133"} err="failed to get container status \"99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133\": rpc error: code = NotFound desc = could not find container \"99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133\": container with ID starting with 99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133 not found: ID does not exist" Apr 17 17:42:07.421991 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.421949 2546 scope.go:117] "RemoveContainer" containerID="fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55" Apr 17 17:42:07.422196 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.422178 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55"} err="failed to get container status \"fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55\": rpc error: code = NotFound desc = could not find container \"fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55\": container with ID starting with fec2ef2096de85651f54b40afb8e4c25faa7b4294306a122876a793c9bd87c55 not found: ID does not exist" Apr 17 17:42:07.422267 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.422196 2546 scope.go:117] "RemoveContainer" containerID="99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133" Apr 17 17:42:07.422421 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.422401 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133"} err="failed to get container status \"99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133\": rpc error: code = NotFound desc = could not find container \"99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133\": container with ID starting with 
99ab7f0dba77b7db1896d5e3beb10d18acd594d2ebe94258e388ebdc7e894133 not found: ID does not exist" Apr 17 17:42:07.429004 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.428978 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"] Apr 17 17:42:07.433728 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.433705 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-e5a50-predictor-85d649894b-q44k5"] Apr 17 17:42:07.439060 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.439041 2546 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7b47acb-62e5-41a1-9006-06239577eae4-proxy-tls\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:42:07.439060 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.439062 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mzjqj\" (UniqueName: \"kubernetes.io/projected/e7b47acb-62e5-41a1-9006-06239577eae4-kube-api-access-mzjqj\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:42:07.439183 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.439073 2546 reconciler_common.go:299] "Volume detached for volume \"message-dumper-raw-e5a50-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7b47acb-62e5-41a1-9006-06239577eae4-message-dumper-raw-e5a50-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:42:07.454130 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.454107 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh"] Apr 17 17:42:07.456517 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:42:07.456488 2546 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d20f797_5ab8_4774_8a1c_67e9a5f8796b.slice/crio-5ceb543c752c81a0c556b46660960b5d78758e87b01255f6794cca4c5d04e92e WatchSource:0}: Error finding container 5ceb543c752c81a0c556b46660960b5d78758e87b01255f6794cca4c5d04e92e: Status 404 returned error can't find the container with id 5ceb543c752c81a0c556b46660960b5d78758e87b01255f6794cca4c5d04e92e
Apr 17 17:42:07.955311 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:07.955275 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b47acb-62e5-41a1-9006-06239577eae4" path="/var/lib/kubelet/pods/e7b47acb-62e5-41a1-9006-06239577eae4/volumes"
Apr 17 17:42:08.412159 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:08.412124 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" event={"ID":"3d20f797-5ab8-4774-8a1c-67e9a5f8796b","Type":"ContainerStarted","Data":"4ba39a90a6164a4e82e10b36e3e0b565d50eb0cd0a3584eb228584b69cc7d8a4"}
Apr 17 17:42:08.412159 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:08.412163 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" event={"ID":"3d20f797-5ab8-4774-8a1c-67e9a5f8796b","Type":"ContainerStarted","Data":"5ceb543c752c81a0c556b46660960b5d78758e87b01255f6794cca4c5d04e92e"}
Apr 17 17:42:10.141235 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:10.141192 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.38:8643/healthz\": dial tcp 10.134.0.38:8643: connect: connection refused"
Apr 17 17:42:11.425501 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:11.425391 2546 generic.go:358] "Generic (PLEG): container finished" podID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerID="559efb716d7eaa5ef5d03b5820d75d87bed44fa1fae269a5c6a3a8874913e9ae" exitCode=0
Apr 17 17:42:11.425501 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:11.425452 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" event={"ID":"ed3a4414-d204-47e5-93a3-c2b560b48283","Type":"ContainerDied","Data":"559efb716d7eaa5ef5d03b5820d75d87bed44fa1fae269a5c6a3a8874913e9ae"}
Apr 17 17:42:11.426799 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:11.426775 2546 generic.go:358] "Generic (PLEG): container finished" podID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerID="4ba39a90a6164a4e82e10b36e3e0b565d50eb0cd0a3584eb228584b69cc7d8a4" exitCode=0
Apr 17 17:42:11.426921 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:11.426828 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" event={"ID":"3d20f797-5ab8-4774-8a1c-67e9a5f8796b","Type":"ContainerDied","Data":"4ba39a90a6164a4e82e10b36e3e0b565d50eb0cd0a3584eb228584b69cc7d8a4"}
Apr 17 17:42:12.431989 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:12.431953 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" event={"ID":"3d20f797-5ab8-4774-8a1c-67e9a5f8796b","Type":"ContainerStarted","Data":"a84cf26e64d666b82999e6b087a31c1b729330c560b75633424b17d1b0598de2"}
Apr 17 17:42:12.432378 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:12.431998 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" event={"ID":"3d20f797-5ab8-4774-8a1c-67e9a5f8796b","Type":"ContainerStarted","Data":"f23a87d72cd06fa798ea1380de8b1401b3903d50fcaec5dfdc7bdb7429232973"}
Apr 17 17:42:12.432378 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:12.432209 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh"
Apr 17 17:42:12.455005 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:12.454959 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podStartSLOduration=6.454945462 podStartE2EDuration="6.454945462s" podCreationTimestamp="2026-04-17 17:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:42:12.453238331 +0000 UTC m=+1025.075519012" watchObservedRunningTime="2026-04-17 17:42:12.454945462 +0000 UTC m=+1025.077226137"
Apr 17 17:42:13.435052 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:13.435025 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh"
Apr 17 17:42:13.436257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:13.436231 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 17 17:42:14.439022 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:14.438977 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 17 17:42:15.141250 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:15.141204 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.38:8643/healthz\": dial tcp 10.134.0.38:8643: connect: connection refused"
Apr 17 17:42:15.145506 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:15.145475 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 17 17:42:15.145940 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:15.145911 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:42:19.442865 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:19.442832 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh"
Apr 17 17:42:19.443414 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:19.443389 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 17 17:42:20.140603 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:20.140562 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.38:8643/healthz\": dial tcp 10.134.0.38:8643: connect: connection refused"
Apr 17 17:42:20.140811 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:20.140713 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:42:25.141366 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:25.141318 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.38:8643/healthz\": dial tcp 10.134.0.38:8643: connect: connection refused"
Apr 17 17:42:25.145666 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:25.145632 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 17 17:42:25.146074 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:25.146038 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:42:29.443325 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:29.443279 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 17 17:42:30.140785 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:30.140743 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.38:8643/healthz\": dial tcp 10.134.0.38:8643: connect: connection refused"
Apr 17 17:42:35.140965 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:35.140921 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.38:8643/healthz\": dial tcp 10.134.0.38:8643: connect: connection refused"
Apr 17 17:42:35.145275 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:35.145233 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 17 17:42:35.145424 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:35.145377 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:42:35.145593 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:35.145561 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:42:35.145733 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:35.145718 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:42:37.077170 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.077143 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:42:37.207487 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.207397 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmjjz\" (UniqueName: \"kubernetes.io/projected/ed3a4414-d204-47e5-93a3-c2b560b48283-kube-api-access-hmjjz\") pod \"ed3a4414-d204-47e5-93a3-c2b560b48283\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") "
Apr 17 17:42:37.207655 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.207498 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed3a4414-d204-47e5-93a3-c2b560b48283-proxy-tls\") pod \"ed3a4414-d204-47e5-93a3-c2b560b48283\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") "
Apr 17 17:42:37.207655 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.207526 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-raw-e5a50-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed3a4414-d204-47e5-93a3-c2b560b48283-isvc-logger-raw-e5a50-kube-rbac-proxy-sar-config\") pod \"ed3a4414-d204-47e5-93a3-c2b560b48283\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") "
Apr 17 17:42:37.207655 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.207556 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed3a4414-d204-47e5-93a3-c2b560b48283-kserve-provision-location\") pod \"ed3a4414-d204-47e5-93a3-c2b560b48283\" (UID: \"ed3a4414-d204-47e5-93a3-c2b560b48283\") "
Apr 17 17:42:37.207966 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.207924 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed3a4414-d204-47e5-93a3-c2b560b48283-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ed3a4414-d204-47e5-93a3-c2b560b48283" (UID: "ed3a4414-d204-47e5-93a3-c2b560b48283"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:42:37.207966 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.207930 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed3a4414-d204-47e5-93a3-c2b560b48283-isvc-logger-raw-e5a50-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-raw-e5a50-kube-rbac-proxy-sar-config") pod "ed3a4414-d204-47e5-93a3-c2b560b48283" (UID: "ed3a4414-d204-47e5-93a3-c2b560b48283"). InnerVolumeSpecName "isvc-logger-raw-e5a50-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:42:37.209443 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.209419 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed3a4414-d204-47e5-93a3-c2b560b48283-kube-api-access-hmjjz" (OuterVolumeSpecName: "kube-api-access-hmjjz") pod "ed3a4414-d204-47e5-93a3-c2b560b48283" (UID: "ed3a4414-d204-47e5-93a3-c2b560b48283"). InnerVolumeSpecName "kube-api-access-hmjjz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:42:37.209598 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.209576 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed3a4414-d204-47e5-93a3-c2b560b48283-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ed3a4414-d204-47e5-93a3-c2b560b48283" (UID: "ed3a4414-d204-47e5-93a3-c2b560b48283"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:42:37.308890 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.308841 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hmjjz\" (UniqueName: \"kubernetes.io/projected/ed3a4414-d204-47e5-93a3-c2b560b48283-kube-api-access-hmjjz\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:42:37.308890 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.308885 2546 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed3a4414-d204-47e5-93a3-c2b560b48283-proxy-tls\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:42:37.308890 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.308898 2546 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-raw-e5a50-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed3a4414-d204-47e5-93a3-c2b560b48283-isvc-logger-raw-e5a50-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:42:37.308890 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.308909 2546 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed3a4414-d204-47e5-93a3-c2b560b48283-kserve-provision-location\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:42:37.518970 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.518878 2546 generic.go:358] "Generic (PLEG): container finished" podID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerID="63dd93751a316ea1d458ebfc813f9515724d0087e97e97766aa5a9af63b1a442" exitCode=0
Apr 17 17:42:37.519138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.518959 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" event={"ID":"ed3a4414-d204-47e5-93a3-c2b560b48283","Type":"ContainerDied","Data":"63dd93751a316ea1d458ebfc813f9515724d0087e97e97766aa5a9af63b1a442"}
Apr 17 17:42:37.519138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.518978 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"
Apr 17 17:42:37.519138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.519000 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff" event={"ID":"ed3a4414-d204-47e5-93a3-c2b560b48283","Type":"ContainerDied","Data":"1011e7cf3d2e2fed75912afaced2a449697d32135112fd06c51276c398913370"}
Apr 17 17:42:37.519138 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.519017 2546 scope.go:117] "RemoveContainer" containerID="63dd93751a316ea1d458ebfc813f9515724d0087e97e97766aa5a9af63b1a442"
Apr 17 17:42:37.527875 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.527858 2546 scope.go:117] "RemoveContainer" containerID="e2719c614403ea408912588a3fbb417329ef0f79aed402ed00663d5f03d9c0df"
Apr 17 17:42:37.535071 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.535049 2546 scope.go:117] "RemoveContainer" containerID="559efb716d7eaa5ef5d03b5820d75d87bed44fa1fae269a5c6a3a8874913e9ae"
Apr 17 17:42:37.541850 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.541824 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"]
Apr 17 17:42:37.542250 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.542234 2546 scope.go:117] "RemoveContainer" containerID="f2f9b0ebd753233c38f6c8459783dc06b31634592a8b2b8be505fdc1978c8e2c"
Apr 17 17:42:37.546181 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.546157 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-e5a50-predictor-66d9bf84b6-hkbff"]
Apr 17 17:42:37.549874 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.549853 2546 scope.go:117] "RemoveContainer" containerID="63dd93751a316ea1d458ebfc813f9515724d0087e97e97766aa5a9af63b1a442"
Apr 17 17:42:37.550146 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:42:37.550126 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63dd93751a316ea1d458ebfc813f9515724d0087e97e97766aa5a9af63b1a442\": container with ID starting with 63dd93751a316ea1d458ebfc813f9515724d0087e97e97766aa5a9af63b1a442 not found: ID does not exist" containerID="63dd93751a316ea1d458ebfc813f9515724d0087e97e97766aa5a9af63b1a442"
Apr 17 17:42:37.550214 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.550156 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63dd93751a316ea1d458ebfc813f9515724d0087e97e97766aa5a9af63b1a442"} err="failed to get container status \"63dd93751a316ea1d458ebfc813f9515724d0087e97e97766aa5a9af63b1a442\": rpc error: code = NotFound desc = could not find container \"63dd93751a316ea1d458ebfc813f9515724d0087e97e97766aa5a9af63b1a442\": container with ID starting with 63dd93751a316ea1d458ebfc813f9515724d0087e97e97766aa5a9af63b1a442 not found: ID does not exist"
Apr 17 17:42:37.550214 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.550175 2546 scope.go:117] "RemoveContainer" containerID="e2719c614403ea408912588a3fbb417329ef0f79aed402ed00663d5f03d9c0df"
Apr 17 17:42:37.550424 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:42:37.550407 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2719c614403ea408912588a3fbb417329ef0f79aed402ed00663d5f03d9c0df\": container with ID starting with e2719c614403ea408912588a3fbb417329ef0f79aed402ed00663d5f03d9c0df not found: ID does not exist" containerID="e2719c614403ea408912588a3fbb417329ef0f79aed402ed00663d5f03d9c0df"
Apr 17 17:42:37.550467 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.550429 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2719c614403ea408912588a3fbb417329ef0f79aed402ed00663d5f03d9c0df"} err="failed to get container status \"e2719c614403ea408912588a3fbb417329ef0f79aed402ed00663d5f03d9c0df\": rpc error: code = NotFound desc = could not find container \"e2719c614403ea408912588a3fbb417329ef0f79aed402ed00663d5f03d9c0df\": container with ID starting with e2719c614403ea408912588a3fbb417329ef0f79aed402ed00663d5f03d9c0df not found: ID does not exist"
Apr 17 17:42:37.550467 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.550445 2546 scope.go:117] "RemoveContainer" containerID="559efb716d7eaa5ef5d03b5820d75d87bed44fa1fae269a5c6a3a8874913e9ae"
Apr 17 17:42:37.550669 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:42:37.550652 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"559efb716d7eaa5ef5d03b5820d75d87bed44fa1fae269a5c6a3a8874913e9ae\": container with ID starting with 559efb716d7eaa5ef5d03b5820d75d87bed44fa1fae269a5c6a3a8874913e9ae not found: ID does not exist" containerID="559efb716d7eaa5ef5d03b5820d75d87bed44fa1fae269a5c6a3a8874913e9ae"
Apr 17 17:42:37.550883 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.550700 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"559efb716d7eaa5ef5d03b5820d75d87bed44fa1fae269a5c6a3a8874913e9ae"} err="failed to get container status \"559efb716d7eaa5ef5d03b5820d75d87bed44fa1fae269a5c6a3a8874913e9ae\": rpc error: code = NotFound desc = could not find container \"559efb716d7eaa5ef5d03b5820d75d87bed44fa1fae269a5c6a3a8874913e9ae\": container with ID starting with 559efb716d7eaa5ef5d03b5820d75d87bed44fa1fae269a5c6a3a8874913e9ae not found: ID does not exist"
Apr 17 17:42:37.550883 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.550723 2546 scope.go:117] "RemoveContainer" containerID="f2f9b0ebd753233c38f6c8459783dc06b31634592a8b2b8be505fdc1978c8e2c"
Apr 17 17:42:37.550988 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:42:37.550949 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2f9b0ebd753233c38f6c8459783dc06b31634592a8b2b8be505fdc1978c8e2c\": container with ID starting with f2f9b0ebd753233c38f6c8459783dc06b31634592a8b2b8be505fdc1978c8e2c not found: ID does not exist" containerID="f2f9b0ebd753233c38f6c8459783dc06b31634592a8b2b8be505fdc1978c8e2c"
Apr 17 17:42:37.550988 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.550966 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2f9b0ebd753233c38f6c8459783dc06b31634592a8b2b8be505fdc1978c8e2c"} err="failed to get container status \"f2f9b0ebd753233c38f6c8459783dc06b31634592a8b2b8be505fdc1978c8e2c\": rpc error: code = NotFound desc = could not find container \"f2f9b0ebd753233c38f6c8459783dc06b31634592a8b2b8be505fdc1978c8e2c\": container with ID starting with f2f9b0ebd753233c38f6c8459783dc06b31634592a8b2b8be505fdc1978c8e2c not found: ID does not exist"
Apr 17 17:42:37.954427 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:37.954391 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" path="/var/lib/kubelet/pods/ed3a4414-d204-47e5-93a3-c2b560b48283/volumes"
Apr 17 17:42:39.444159 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:39.444109 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 17 17:42:49.443405 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:49.443301 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 17 17:42:59.443864 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:42:59.443822 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 17 17:43:09.443594 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:43:09.443547 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 17 17:43:19.443990 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:43:19.443946 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 17 17:43:29.443492 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:43:29.443449 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 17 17:43:31.949569 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:43:31.949528 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 17 17:43:41.950125 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:43:41.950076 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 17 17:43:51.950429 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:43:51.950377 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 17 17:44:01.949523 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:01.949477 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 17 17:44:11.949543 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:11.949494 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 17 17:44:21.949975 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:21.949933 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 17 17:44:31.953279 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:31.953250 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh"
Apr 17 17:44:37.168178 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.168142 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh"]
Apr 17 17:44:37.168641 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.168447 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" containerID="cri-o://f23a87d72cd06fa798ea1380de8b1401b3903d50fcaec5dfdc7bdb7429232973" gracePeriod=30
Apr 17 17:44:37.168641 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.168499 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kube-rbac-proxy" containerID="cri-o://a84cf26e64d666b82999e6b087a31c1b729330c560b75633424b17d1b0598de2" gracePeriod=30
Apr 17 17:44:37.259905 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.259867 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q"]
Apr 17 17:44:37.260221 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260210 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7b47acb-62e5-41a1-9006-06239577eae4" containerName="kube-rbac-proxy"
Apr 17 17:44:37.260221 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260223 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b47acb-62e5-41a1-9006-06239577eae4" containerName="kube-rbac-proxy"
Apr 17 17:44:37.260313 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260238 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="storage-initializer"
Apr 17 17:44:37.260313 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260244 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="storage-initializer"
Apr 17 17:44:37.260313 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260251 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kserve-container"
Apr 17 17:44:37.260313 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260257 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kserve-container"
Apr 17 17:44:37.260313 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260273 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7b47acb-62e5-41a1-9006-06239577eae4" containerName="kserve-container"
Apr 17 17:44:37.260313 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260278 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b47acb-62e5-41a1-9006-06239577eae4" containerName="kserve-container"
Apr 17 17:44:37.260313 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260288 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kube-rbac-proxy"
Apr 17 17:44:37.260313 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260293 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kube-rbac-proxy"
Apr 17 17:44:37.260313 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260300 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="agent"
Apr 17 17:44:37.260313 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260306 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="agent"
Apr 17 17:44:37.260604 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260355 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kube-rbac-proxy"
Apr 17 17:44:37.260604 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260364 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7b47acb-62e5-41a1-9006-06239577eae4" containerName="kserve-container"
Apr 17 17:44:37.260604 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260374 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="kserve-container"
Apr 17 17:44:37.260604 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260382 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed3a4414-d204-47e5-93a3-c2b560b48283" containerName="agent"
Apr 17 17:44:37.260604 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.260388 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7b47acb-62e5-41a1-9006-06239577eae4" containerName="kube-rbac-proxy"
Apr 17 17:44:37.263664 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.263646 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q"
Apr 17 17:44:37.265899 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.265872 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-48f392-predictor-serving-cert\""
Apr 17 17:44:37.266023 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.265949 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-48f392-kube-rbac-proxy-sar-config\""
Apr 17 17:44:37.271706 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.271664 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q"]
Apr 17 17:44:37.384955 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.384920 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-proxy-tls\") pod \"isvc-primary-48f392-predictor-bccb6d7b8-9qh8q\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") " pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q"
Apr 17 17:44:37.385128 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.384967 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5tch\" (UniqueName: \"kubernetes.io/projected/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-kube-api-access-h5tch\") pod \"isvc-primary-48f392-predictor-bccb6d7b8-9qh8q\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") " pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q"
Apr 17 17:44:37.385128 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.385064 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-48f392-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-isvc-primary-48f392-kube-rbac-proxy-sar-config\") pod \"isvc-primary-48f392-predictor-bccb6d7b8-9qh8q\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") " pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q"
Apr 17 17:44:37.385128 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.385123 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-kserve-provision-location\") pod \"isvc-primary-48f392-predictor-bccb6d7b8-9qh8q\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") " pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q"
Apr 17 17:44:37.486444 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.486367 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-48f392-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-isvc-primary-48f392-kube-rbac-proxy-sar-config\") pod \"isvc-primary-48f392-predictor-bccb6d7b8-9qh8q\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") " pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q"
Apr 17 17:44:37.486444 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.486421 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-kserve-provision-location\") pod \"isvc-primary-48f392-predictor-bccb6d7b8-9qh8q\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") " pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q"
Apr 17 17:44:37.486637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.486458 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-proxy-tls\") pod \"isvc-primary-48f392-predictor-bccb6d7b8-9qh8q\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") " pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q"
Apr 17 17:44:37.486637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.486482 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5tch\" (UniqueName: \"kubernetes.io/projected/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-kube-api-access-h5tch\") pod \"isvc-primary-48f392-predictor-bccb6d7b8-9qh8q\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") " pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q"
Apr 17 17:44:37.486637 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:44:37.486607 2546 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-primary-48f392-predictor-serving-cert: secret "isvc-primary-48f392-predictor-serving-cert" not found
Apr 17 17:44:37.486779 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:44:37.486664 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-proxy-tls podName:c82f1848-d93e-4e3c-b862-eeb4bc7e6f17 nodeName:}" failed. No retries permitted until 2026-04-17 17:44:37.986647627 +0000 UTC m=+1170.608928280 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-proxy-tls") pod "isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" (UID: "c82f1848-d93e-4e3c-b862-eeb4bc7e6f17") : secret "isvc-primary-48f392-predictor-serving-cert" not found Apr 17 17:44:37.486909 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.486885 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-kserve-provision-location\") pod \"isvc-primary-48f392-predictor-bccb6d7b8-9qh8q\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") " pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" Apr 17 17:44:37.487112 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.487096 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-48f392-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-isvc-primary-48f392-kube-rbac-proxy-sar-config\") pod \"isvc-primary-48f392-predictor-bccb6d7b8-9qh8q\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") " pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" Apr 17 17:44:37.495239 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.495219 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5tch\" (UniqueName: \"kubernetes.io/projected/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-kube-api-access-h5tch\") pod \"isvc-primary-48f392-predictor-bccb6d7b8-9qh8q\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") " pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" Apr 17 17:44:37.926193 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.926158 2546 generic.go:358] "Generic (PLEG): container finished" podID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerID="a84cf26e64d666b82999e6b087a31c1b729330c560b75633424b17d1b0598de2" 
exitCode=2 Apr 17 17:44:37.926369 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.926232 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" event={"ID":"3d20f797-5ab8-4774-8a1c-67e9a5f8796b","Type":"ContainerDied","Data":"a84cf26e64d666b82999e6b087a31c1b729330c560b75633424b17d1b0598de2"} Apr 17 17:44:37.991608 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.991578 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-proxy-tls\") pod \"isvc-primary-48f392-predictor-bccb6d7b8-9qh8q\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") " pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" Apr 17 17:44:37.993997 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:37.993968 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-proxy-tls\") pod \"isvc-primary-48f392-predictor-bccb6d7b8-9qh8q\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") " pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" Apr 17 17:44:38.175939 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:38.175897 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" Apr 17 17:44:38.305191 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:38.305167 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q"] Apr 17 17:44:38.307293 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:44:38.307260 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc82f1848_d93e_4e3c_b862_eeb4bc7e6f17.slice/crio-aaeaf2a4d9ae3254b9a7ec767ccbdadd28fc6480bed6383a8b91edba7ceb225e WatchSource:0}: Error finding container aaeaf2a4d9ae3254b9a7ec767ccbdadd28fc6480bed6383a8b91edba7ceb225e: Status 404 returned error can't find the container with id aaeaf2a4d9ae3254b9a7ec767ccbdadd28fc6480bed6383a8b91edba7ceb225e Apr 17 17:44:38.931258 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:38.931218 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" event={"ID":"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17","Type":"ContainerStarted","Data":"b30763453f183176c84210383718923bc7bea46bc73683f0d67c1d693a2210c8"} Apr 17 17:44:38.931258 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:38.931261 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" event={"ID":"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17","Type":"ContainerStarted","Data":"aaeaf2a4d9ae3254b9a7ec767ccbdadd28fc6480bed6383a8b91edba7ceb225e"} Apr 17 17:44:39.439428 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:39.439382 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.39:8643/healthz\": dial tcp 10.134.0.39:8643: connect: connection 
refused" Apr 17 17:44:41.950384 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:41.950342 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 17 17:44:42.946191 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:42.946158 2546 generic.go:358] "Generic (PLEG): container finished" podID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerID="b30763453f183176c84210383718923bc7bea46bc73683f0d67c1d693a2210c8" exitCode=0 Apr 17 17:44:42.946367 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:42.946231 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" event={"ID":"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17","Type":"ContainerDied","Data":"b30763453f183176c84210383718923bc7bea46bc73683f0d67c1d693a2210c8"} Apr 17 17:44:43.953575 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:43.953541 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" event={"ID":"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17","Type":"ContainerStarted","Data":"70300dac94f85b79c1c158538d2c692dce3c0862e42c662e5474521052368873"} Apr 17 17:44:43.953575 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:43.953576 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" event={"ID":"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17","Type":"ContainerStarted","Data":"b5c0ff2088ad71d5c31f021aee37b70fbeb08f46c51babc7e11a416b8eeeeeb4"} Apr 17 17:44:43.954011 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:43.953792 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" Apr 17 17:44:43.975554 
ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:43.975488 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" podStartSLOduration=6.975469171 podStartE2EDuration="6.975469171s" podCreationTimestamp="2026-04-17 17:44:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:44:43.973844977 +0000 UTC m=+1176.596125652" watchObservedRunningTime="2026-04-17 17:44:43.975469171 +0000 UTC m=+1176.597749846" Apr 17 17:44:44.439519 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:44.439468 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.39:8643/healthz\": dial tcp 10.134.0.39:8643: connect: connection refused" Apr 17 17:44:44.955773 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:44.955735 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" Apr 17 17:44:44.957237 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:44.957208 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 17 17:44:45.958340 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:45.958299 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 
17 17:44:46.308364 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.308338 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" Apr 17 17:44:46.467337 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.467301 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk748\" (UniqueName: \"kubernetes.io/projected/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-kube-api-access-dk748\") pod \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\" (UID: \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\") " Apr 17 17:44:46.467502 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.467363 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-proxy-tls\") pod \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\" (UID: \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\") " Apr 17 17:44:46.467502 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.467388 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-kserve-provision-location\") pod \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\" (UID: \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\") " Apr 17 17:44:46.467584 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.467532 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-scale-raw-2162a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-isvc-sklearn-scale-raw-2162a-kube-rbac-proxy-sar-config\") pod \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\" (UID: \"3d20f797-5ab8-4774-8a1c-67e9a5f8796b\") " Apr 17 17:44:46.467674 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.467652 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3d20f797-5ab8-4774-8a1c-67e9a5f8796b" (UID: "3d20f797-5ab8-4774-8a1c-67e9a5f8796b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:44:46.467898 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.467869 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-isvc-sklearn-scale-raw-2162a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-scale-raw-2162a-kube-rbac-proxy-sar-config") pod "3d20f797-5ab8-4774-8a1c-67e9a5f8796b" (UID: "3d20f797-5ab8-4774-8a1c-67e9a5f8796b"). InnerVolumeSpecName "isvc-sklearn-scale-raw-2162a-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:44:46.467898 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.467893 2546 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-kserve-provision-location\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:44:46.469490 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.469454 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-kube-api-access-dk748" (OuterVolumeSpecName: "kube-api-access-dk748") pod "3d20f797-5ab8-4774-8a1c-67e9a5f8796b" (UID: "3d20f797-5ab8-4774-8a1c-67e9a5f8796b"). InnerVolumeSpecName "kube-api-access-dk748". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:44:46.469552 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.469479 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3d20f797-5ab8-4774-8a1c-67e9a5f8796b" (UID: "3d20f797-5ab8-4774-8a1c-67e9a5f8796b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:44:46.569064 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.569032 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dk748\" (UniqueName: \"kubernetes.io/projected/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-kube-api-access-dk748\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:44:46.569064 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.569064 2546 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-proxy-tls\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:44:46.569064 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.569076 2546 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-scale-raw-2162a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3d20f797-5ab8-4774-8a1c-67e9a5f8796b-isvc-sklearn-scale-raw-2162a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:44:46.963133 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.963042 2546 generic.go:358] "Generic (PLEG): container finished" podID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerID="f23a87d72cd06fa798ea1380de8b1401b3903d50fcaec5dfdc7bdb7429232973" exitCode=0 Apr 17 17:44:46.963133 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.963118 2546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" Apr 17 17:44:46.963575 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.963116 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" event={"ID":"3d20f797-5ab8-4774-8a1c-67e9a5f8796b","Type":"ContainerDied","Data":"f23a87d72cd06fa798ea1380de8b1401b3903d50fcaec5dfdc7bdb7429232973"} Apr 17 17:44:46.963575 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.963217 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh" event={"ID":"3d20f797-5ab8-4774-8a1c-67e9a5f8796b","Type":"ContainerDied","Data":"5ceb543c752c81a0c556b46660960b5d78758e87b01255f6794cca4c5d04e92e"} Apr 17 17:44:46.963575 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.963232 2546 scope.go:117] "RemoveContainer" containerID="a84cf26e64d666b82999e6b087a31c1b729330c560b75633424b17d1b0598de2" Apr 17 17:44:46.971564 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.971542 2546 scope.go:117] "RemoveContainer" containerID="f23a87d72cd06fa798ea1380de8b1401b3903d50fcaec5dfdc7bdb7429232973" Apr 17 17:44:46.978750 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.978731 2546 scope.go:117] "RemoveContainer" containerID="4ba39a90a6164a4e82e10b36e3e0b565d50eb0cd0a3584eb228584b69cc7d8a4" Apr 17 17:44:46.985463 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.985447 2546 scope.go:117] "RemoveContainer" containerID="a84cf26e64d666b82999e6b087a31c1b729330c560b75633424b17d1b0598de2" Apr 17 17:44:46.985736 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:44:46.985714 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a84cf26e64d666b82999e6b087a31c1b729330c560b75633424b17d1b0598de2\": container with ID starting with a84cf26e64d666b82999e6b087a31c1b729330c560b75633424b17d1b0598de2 
not found: ID does not exist" containerID="a84cf26e64d666b82999e6b087a31c1b729330c560b75633424b17d1b0598de2" Apr 17 17:44:46.985778 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.985747 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a84cf26e64d666b82999e6b087a31c1b729330c560b75633424b17d1b0598de2"} err="failed to get container status \"a84cf26e64d666b82999e6b087a31c1b729330c560b75633424b17d1b0598de2\": rpc error: code = NotFound desc = could not find container \"a84cf26e64d666b82999e6b087a31c1b729330c560b75633424b17d1b0598de2\": container with ID starting with a84cf26e64d666b82999e6b087a31c1b729330c560b75633424b17d1b0598de2 not found: ID does not exist" Apr 17 17:44:46.985778 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.985764 2546 scope.go:117] "RemoveContainer" containerID="f23a87d72cd06fa798ea1380de8b1401b3903d50fcaec5dfdc7bdb7429232973" Apr 17 17:44:46.985992 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:44:46.985975 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f23a87d72cd06fa798ea1380de8b1401b3903d50fcaec5dfdc7bdb7429232973\": container with ID starting with f23a87d72cd06fa798ea1380de8b1401b3903d50fcaec5dfdc7bdb7429232973 not found: ID does not exist" containerID="f23a87d72cd06fa798ea1380de8b1401b3903d50fcaec5dfdc7bdb7429232973" Apr 17 17:44:46.986037 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.985999 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23a87d72cd06fa798ea1380de8b1401b3903d50fcaec5dfdc7bdb7429232973"} err="failed to get container status \"f23a87d72cd06fa798ea1380de8b1401b3903d50fcaec5dfdc7bdb7429232973\": rpc error: code = NotFound desc = could not find container \"f23a87d72cd06fa798ea1380de8b1401b3903d50fcaec5dfdc7bdb7429232973\": container with ID starting with f23a87d72cd06fa798ea1380de8b1401b3903d50fcaec5dfdc7bdb7429232973 not found: ID does 
not exist" Apr 17 17:44:46.986037 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.986016 2546 scope.go:117] "RemoveContainer" containerID="4ba39a90a6164a4e82e10b36e3e0b565d50eb0cd0a3584eb228584b69cc7d8a4" Apr 17 17:44:46.986237 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:44:46.986219 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ba39a90a6164a4e82e10b36e3e0b565d50eb0cd0a3584eb228584b69cc7d8a4\": container with ID starting with 4ba39a90a6164a4e82e10b36e3e0b565d50eb0cd0a3584eb228584b69cc7d8a4 not found: ID does not exist" containerID="4ba39a90a6164a4e82e10b36e3e0b565d50eb0cd0a3584eb228584b69cc7d8a4" Apr 17 17:44:46.986295 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.986248 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ba39a90a6164a4e82e10b36e3e0b565d50eb0cd0a3584eb228584b69cc7d8a4"} err="failed to get container status \"4ba39a90a6164a4e82e10b36e3e0b565d50eb0cd0a3584eb228584b69cc7d8a4\": rpc error: code = NotFound desc = could not find container \"4ba39a90a6164a4e82e10b36e3e0b565d50eb0cd0a3584eb228584b69cc7d8a4\": container with ID starting with 4ba39a90a6164a4e82e10b36e3e0b565d50eb0cd0a3584eb228584b69cc7d8a4 not found: ID does not exist" Apr 17 17:44:46.994563 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:46.994543 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh"] Apr 17 17:44:47.001399 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:47.001375 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-2162a-predictor-6cdd5f7cff-6g8bh"] Apr 17 17:44:47.953426 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:47.953386 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" 
path="/var/lib/kubelet/pods/3d20f797-5ab8-4774-8a1c-67e9a5f8796b/volumes" Apr 17 17:44:50.962273 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:50.962239 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" Apr 17 17:44:50.962781 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:44:50.962755 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 17 17:45:00.963496 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:00.963458 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 17 17:45:07.900164 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:07.900136 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/ovn-acl-logging/0.log" Apr 17 17:45:07.903112 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:07.903092 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/ovn-acl-logging/0.log" Apr 17 17:45:10.962765 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:10.962725 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 17 17:45:20.963131 ip-10-0-137-46 kubenswrapper[2546]: I0417 
17:45:20.963091 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 17 17:45:30.963200 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:30.963164 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 17 17:45:40.963564 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:40.963516 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 17 17:45:50.962874 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:50.962796 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" Apr 17 17:45:57.383199 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.383163 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"] Apr 17 17:45:57.383593 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.383516 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="storage-initializer" Apr 17 17:45:57.383593 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.383530 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="storage-initializer" Apr 17 17:45:57.383593 ip-10-0-137-46 
kubenswrapper[2546]: I0417 17:45:57.383544 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" Apr 17 17:45:57.383593 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.383549 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" Apr 17 17:45:57.383593 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.383567 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kube-rbac-proxy" Apr 17 17:45:57.383593 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.383573 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kube-rbac-proxy" Apr 17 17:45:57.383834 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.383624 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kserve-container" Apr 17 17:45:57.383834 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.383635 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d20f797-5ab8-4774-8a1c-67e9a5f8796b" containerName="kube-rbac-proxy" Apr 17 17:45:57.387171 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.387152 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.389599 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.389573 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 17 17:45:57.389767 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.389605 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-48f392-kube-rbac-proxy-sar-config\""
Apr 17 17:45:57.389767 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.389649 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-48f392-dockercfg-x4nzr\""
Apr 17 17:45:57.389767 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.389715 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-48f392-predictor-serving-cert\""
Apr 17 17:45:57.389767 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.389739 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-48f392\""
Apr 17 17:45:57.396079 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.396059 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"]
Apr 17 17:45:57.567884 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.567835 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxlln\" (UniqueName: \"kubernetes.io/projected/3020e928-41dd-41c1-9cbb-3102ca37799a-kube-api-access-cxlln\") pod \"isvc-secondary-48f392-predictor-784cb8f88c-mrsdl\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") " pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.567884 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.567878 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3020e928-41dd-41c1-9cbb-3102ca37799a-kserve-provision-location\") pod \"isvc-secondary-48f392-predictor-784cb8f88c-mrsdl\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") " pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.568106 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.567978 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3020e928-41dd-41c1-9cbb-3102ca37799a-proxy-tls\") pod \"isvc-secondary-48f392-predictor-784cb8f88c-mrsdl\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") " pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.568106 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.568013 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-48f392-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3020e928-41dd-41c1-9cbb-3102ca37799a-isvc-secondary-48f392-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-48f392-predictor-784cb8f88c-mrsdl\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") " pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.568106 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.568054 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3020e928-41dd-41c1-9cbb-3102ca37799a-cabundle-cert\") pod \"isvc-secondary-48f392-predictor-784cb8f88c-mrsdl\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") " pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.669417 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.669327 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3020e928-41dd-41c1-9cbb-3102ca37799a-cabundle-cert\") pod \"isvc-secondary-48f392-predictor-784cb8f88c-mrsdl\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") " pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.669417 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.669385 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxlln\" (UniqueName: \"kubernetes.io/projected/3020e928-41dd-41c1-9cbb-3102ca37799a-kube-api-access-cxlln\") pod \"isvc-secondary-48f392-predictor-784cb8f88c-mrsdl\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") " pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.669417 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.669411 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3020e928-41dd-41c1-9cbb-3102ca37799a-kserve-provision-location\") pod \"isvc-secondary-48f392-predictor-784cb8f88c-mrsdl\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") " pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.669719 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.669456 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3020e928-41dd-41c1-9cbb-3102ca37799a-proxy-tls\") pod \"isvc-secondary-48f392-predictor-784cb8f88c-mrsdl\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") " pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.669719 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.669480 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-48f392-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3020e928-41dd-41c1-9cbb-3102ca37799a-isvc-secondary-48f392-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-48f392-predictor-784cb8f88c-mrsdl\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") " pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.669851 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.669833 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3020e928-41dd-41c1-9cbb-3102ca37799a-kserve-provision-location\") pod \"isvc-secondary-48f392-predictor-784cb8f88c-mrsdl\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") " pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.670111 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.670091 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3020e928-41dd-41c1-9cbb-3102ca37799a-cabundle-cert\") pod \"isvc-secondary-48f392-predictor-784cb8f88c-mrsdl\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") " pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.670187 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.670164 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-48f392-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3020e928-41dd-41c1-9cbb-3102ca37799a-isvc-secondary-48f392-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-48f392-predictor-784cb8f88c-mrsdl\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") " pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.671848 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.671831 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3020e928-41dd-41c1-9cbb-3102ca37799a-proxy-tls\") pod \"isvc-secondary-48f392-predictor-784cb8f88c-mrsdl\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") " pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.677901 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.677875 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxlln\" (UniqueName: \"kubernetes.io/projected/3020e928-41dd-41c1-9cbb-3102ca37799a-kube-api-access-cxlln\") pod \"isvc-secondary-48f392-predictor-784cb8f88c-mrsdl\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") " pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.699494 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.699464 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:45:57.825030 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.825003 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"]
Apr 17 17:45:57.827450 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:45:57.827419 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3020e928_41dd_41c1_9cbb_3102ca37799a.slice/crio-5b77a56771d004dfa197a61e46c62f9b53bd4436b72157051c255bab22261fe5 WatchSource:0}: Error finding container 5b77a56771d004dfa197a61e46c62f9b53bd4436b72157051c255bab22261fe5: Status 404 returned error can't find the container with id 5b77a56771d004dfa197a61e46c62f9b53bd4436b72157051c255bab22261fe5
Apr 17 17:45:57.829766 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:57.829746 2546 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:45:58.190498 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:58.190457 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl" event={"ID":"3020e928-41dd-41c1-9cbb-3102ca37799a","Type":"ContainerStarted","Data":"001ce0254c6b0942b348a56cdc38d088ff17166a58acf504499ddb1fb4ed0d8e"}
Apr 17 17:45:58.190498 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:45:58.190495 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl" event={"ID":"3020e928-41dd-41c1-9cbb-3102ca37799a","Type":"ContainerStarted","Data":"5b77a56771d004dfa197a61e46c62f9b53bd4436b72157051c255bab22261fe5"}
Apr 17 17:46:01.201705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:01.201602 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-48f392-predictor-784cb8f88c-mrsdl_3020e928-41dd-41c1-9cbb-3102ca37799a/storage-initializer/0.log"
Apr 17 17:46:01.201705 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:01.201644 2546 generic.go:358] "Generic (PLEG): container finished" podID="3020e928-41dd-41c1-9cbb-3102ca37799a" containerID="001ce0254c6b0942b348a56cdc38d088ff17166a58acf504499ddb1fb4ed0d8e" exitCode=1
Apr 17 17:46:01.202125 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:01.201728 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl" event={"ID":"3020e928-41dd-41c1-9cbb-3102ca37799a","Type":"ContainerDied","Data":"001ce0254c6b0942b348a56cdc38d088ff17166a58acf504499ddb1fb4ed0d8e"}
Apr 17 17:46:02.207295 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:02.207264 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-48f392-predictor-784cb8f88c-mrsdl_3020e928-41dd-41c1-9cbb-3102ca37799a/storage-initializer/0.log"
Apr 17 17:46:02.207716 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:02.207307 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl" event={"ID":"3020e928-41dd-41c1-9cbb-3102ca37799a","Type":"ContainerStarted","Data":"7ae51ba2b74f45a80d7e19798e9bc7756b6f5b7cdec97c5094fc16962e06f265"}
Apr 17 17:46:06.221810 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:06.221781 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-48f392-predictor-784cb8f88c-mrsdl_3020e928-41dd-41c1-9cbb-3102ca37799a/storage-initializer/1.log"
Apr 17 17:46:06.222193 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:06.222156 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-48f392-predictor-784cb8f88c-mrsdl_3020e928-41dd-41c1-9cbb-3102ca37799a/storage-initializer/0.log"
Apr 17 17:46:06.222234 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:06.222189 2546 generic.go:358] "Generic (PLEG): container finished" podID="3020e928-41dd-41c1-9cbb-3102ca37799a" containerID="7ae51ba2b74f45a80d7e19798e9bc7756b6f5b7cdec97c5094fc16962e06f265" exitCode=1
Apr 17 17:46:06.222280 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:06.222261 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl" event={"ID":"3020e928-41dd-41c1-9cbb-3102ca37799a","Type":"ContainerDied","Data":"7ae51ba2b74f45a80d7e19798e9bc7756b6f5b7cdec97c5094fc16962e06f265"}
Apr 17 17:46:06.222316 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:06.222302 2546 scope.go:117] "RemoveContainer" containerID="001ce0254c6b0942b348a56cdc38d088ff17166a58acf504499ddb1fb4ed0d8e"
Apr 17 17:46:06.222722 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:06.222697 2546 scope.go:117] "RemoveContainer" containerID="001ce0254c6b0942b348a56cdc38d088ff17166a58acf504499ddb1fb4ed0d8e"
Apr 17 17:46:06.232829 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:46:06.232799 2546 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-48f392-predictor-784cb8f88c-mrsdl_kserve-ci-e2e-test_3020e928-41dd-41c1-9cbb-3102ca37799a_0 in pod sandbox 5b77a56771d004dfa197a61e46c62f9b53bd4436b72157051c255bab22261fe5 from index: no such id: '001ce0254c6b0942b348a56cdc38d088ff17166a58acf504499ddb1fb4ed0d8e'" containerID="001ce0254c6b0942b348a56cdc38d088ff17166a58acf504499ddb1fb4ed0d8e"
Apr 17 17:46:06.232921 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:06.232838 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"001ce0254c6b0942b348a56cdc38d088ff17166a58acf504499ddb1fb4ed0d8e"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-48f392-predictor-784cb8f88c-mrsdl_kserve-ci-e2e-test_3020e928-41dd-41c1-9cbb-3102ca37799a_0 in pod sandbox 5b77a56771d004dfa197a61e46c62f9b53bd4436b72157051c255bab22261fe5 from index: no such id: '001ce0254c6b0942b348a56cdc38d088ff17166a58acf504499ddb1fb4ed0d8e'"
Apr 17 17:46:06.233101 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:46:06.233079 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-48f392-predictor-784cb8f88c-mrsdl_kserve-ci-e2e-test(3020e928-41dd-41c1-9cbb-3102ca37799a)\"" pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl" podUID="3020e928-41dd-41c1-9cbb-3102ca37799a"
Apr 17 17:46:07.227901 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:07.227876 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-48f392-predictor-784cb8f88c-mrsdl_3020e928-41dd-41c1-9cbb-3102ca37799a/storage-initializer/1.log"
Apr 17 17:46:13.439343 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.439310 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"]
Apr 17 17:46:13.511971 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.511942 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q"]
Apr 17 17:46:13.512394 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.512294 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kserve-container" containerID="cri-o://b5c0ff2088ad71d5c31f021aee37b70fbeb08f46c51babc7e11a416b8eeeeeb4" gracePeriod=30
Apr 17 17:46:13.512394 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.512372 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kube-rbac-proxy" containerID="cri-o://70300dac94f85b79c1c158538d2c692dce3c0862e42c662e5474521052368873" gracePeriod=30
Apr 17 17:46:13.592352 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.592322 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"]
Apr 17 17:46:13.600050 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.600022 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:13.602704 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.602647 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-f4bb2d-kube-rbac-proxy-sar-config\""
Apr 17 17:46:13.603193 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.602780 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-f4bb2d-predictor-serving-cert\""
Apr 17 17:46:13.603193 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.602848 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-f4bb2d\""
Apr 17 17:46:13.603193 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.603095 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-f4bb2d-dockercfg-qgncf\""
Apr 17 17:46:13.605663 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.605641 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"]
Apr 17 17:46:13.635009 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.634990 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-48f392-predictor-784cb8f88c-mrsdl_3020e928-41dd-41c1-9cbb-3102ca37799a/storage-initializer/1.log"
Apr 17 17:46:13.635104 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.635050 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:46:13.690581 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.690494 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-48f392-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3020e928-41dd-41c1-9cbb-3102ca37799a-isvc-secondary-48f392-kube-rbac-proxy-sar-config\") pod \"3020e928-41dd-41c1-9cbb-3102ca37799a\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") "
Apr 17 17:46:13.690581 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.690544 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3020e928-41dd-41c1-9cbb-3102ca37799a-cabundle-cert\") pod \"3020e928-41dd-41c1-9cbb-3102ca37799a\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") "
Apr 17 17:46:13.690581 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.690565 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3020e928-41dd-41c1-9cbb-3102ca37799a-proxy-tls\") pod \"3020e928-41dd-41c1-9cbb-3102ca37799a\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") "
Apr 17 17:46:13.690890 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.690641 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-proxy-tls\") pod \"isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:13.690890 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.690706 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-f4bb2d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-isvc-init-fail-f4bb2d-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:13.690890 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.690742 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-cabundle-cert\") pod \"isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:13.690890 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.690834 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsw7v\" (UniqueName: \"kubernetes.io/projected/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-kube-api-access-gsw7v\") pod \"isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:13.690890 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.690857 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-kserve-provision-location\") pod \"isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:13.691144 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.690908 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3020e928-41dd-41c1-9cbb-3102ca37799a-isvc-secondary-48f392-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-48f392-kube-rbac-proxy-sar-config") pod "3020e928-41dd-41c1-9cbb-3102ca37799a" (UID: "3020e928-41dd-41c1-9cbb-3102ca37799a"). InnerVolumeSpecName "isvc-secondary-48f392-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:46:13.691144 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.690977 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3020e928-41dd-41c1-9cbb-3102ca37799a-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "3020e928-41dd-41c1-9cbb-3102ca37799a" (UID: "3020e928-41dd-41c1-9cbb-3102ca37799a"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:46:13.692663 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.692641 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3020e928-41dd-41c1-9cbb-3102ca37799a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3020e928-41dd-41c1-9cbb-3102ca37799a" (UID: "3020e928-41dd-41c1-9cbb-3102ca37799a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:46:13.791465 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.791433 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3020e928-41dd-41c1-9cbb-3102ca37799a-kserve-provision-location\") pod \"3020e928-41dd-41c1-9cbb-3102ca37799a\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") "
Apr 17 17:46:13.791465 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.791473 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxlln\" (UniqueName: \"kubernetes.io/projected/3020e928-41dd-41c1-9cbb-3102ca37799a-kube-api-access-cxlln\") pod \"3020e928-41dd-41c1-9cbb-3102ca37799a\" (UID: \"3020e928-41dd-41c1-9cbb-3102ca37799a\") "
Apr 17 17:46:13.791727 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.791621 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsw7v\" (UniqueName: \"kubernetes.io/projected/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-kube-api-access-gsw7v\") pod \"isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:13.791727 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.791655 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-kserve-provision-location\") pod \"isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:13.791839 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.791738 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-proxy-tls\") pod \"isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:13.791839 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.791745 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3020e928-41dd-41c1-9cbb-3102ca37799a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3020e928-41dd-41c1-9cbb-3102ca37799a" (UID: "3020e928-41dd-41c1-9cbb-3102ca37799a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:46:13.791839 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.791772 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-f4bb2d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-isvc-init-fail-f4bb2d-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:13.791839 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.791811 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-cabundle-cert\") pod \"isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:13.792084 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.791869 2546 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-48f392-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3020e928-41dd-41c1-9cbb-3102ca37799a-isvc-secondary-48f392-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:46:13.792084 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.791887 2546 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3020e928-41dd-41c1-9cbb-3102ca37799a-cabundle-cert\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:46:13.792084 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.791901 2546 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3020e928-41dd-41c1-9cbb-3102ca37799a-proxy-tls\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:46:13.792084 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.791913 2546 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3020e928-41dd-41c1-9cbb-3102ca37799a-kserve-provision-location\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:46:13.792084 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:46:13.791916 2546 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-serving-cert: secret "isvc-init-fail-f4bb2d-predictor-serving-cert" not found
Apr 17 17:46:13.792084 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:46:13.791993 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-proxy-tls podName:5b824ed2-0ab5-48d9-97ed-324eb2d248b3 nodeName:}" failed. No retries permitted until 2026-04-17 17:46:14.291971477 +0000 UTC m=+1266.914252141 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-proxy-tls") pod "isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7" (UID: "5b824ed2-0ab5-48d9-97ed-324eb2d248b3") : secret "isvc-init-fail-f4bb2d-predictor-serving-cert" not found
Apr 17 17:46:13.792290 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.792108 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-kserve-provision-location\") pod \"isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:13.792537 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.792512 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-f4bb2d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-isvc-init-fail-f4bb2d-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:13.792583 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.792563 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-cabundle-cert\") pod \"isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:13.793647 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.793624 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3020e928-41dd-41c1-9cbb-3102ca37799a-kube-api-access-cxlln" (OuterVolumeSpecName: "kube-api-access-cxlln") pod "3020e928-41dd-41c1-9cbb-3102ca37799a" (UID: "3020e928-41dd-41c1-9cbb-3102ca37799a"). InnerVolumeSpecName "kube-api-access-cxlln". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:46:13.801065 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.801039 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsw7v\" (UniqueName: \"kubernetes.io/projected/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-kube-api-access-gsw7v\") pod \"isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:13.892847 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:13.892811 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cxlln\" (UniqueName: \"kubernetes.io/projected/3020e928-41dd-41c1-9cbb-3102ca37799a-kube-api-access-cxlln\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:46:14.256162 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:14.256064 2546 generic.go:358] "Generic (PLEG): container finished" podID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerID="70300dac94f85b79c1c158538d2c692dce3c0862e42c662e5474521052368873" exitCode=2
Apr 17 17:46:14.256162 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:14.256141 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" event={"ID":"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17","Type":"ContainerDied","Data":"70300dac94f85b79c1c158538d2c692dce3c0862e42c662e5474521052368873"}
Apr 17 17:46:14.257222 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:14.257205 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-48f392-predictor-784cb8f88c-mrsdl_3020e928-41dd-41c1-9cbb-3102ca37799a/storage-initializer/1.log"
Apr 17 17:46:14.257327 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:14.257298 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl" event={"ID":"3020e928-41dd-41c1-9cbb-3102ca37799a","Type":"ContainerDied","Data":"5b77a56771d004dfa197a61e46c62f9b53bd4436b72157051c255bab22261fe5"}
Apr 17 17:46:14.257327 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:14.257320 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"
Apr 17 17:46:14.257396 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:14.257332 2546 scope.go:117] "RemoveContainer" containerID="7ae51ba2b74f45a80d7e19798e9bc7756b6f5b7cdec97c5094fc16962e06f265"
Apr 17 17:46:14.289184 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:14.289154 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"]
Apr 17 17:46:14.292473 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:14.292442 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-48f392-predictor-784cb8f88c-mrsdl"]
Apr 17 17:46:14.295038 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:14.295019 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-proxy-tls\") pod \"isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:14.297367 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:14.297350 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-proxy-tls\") pod \"isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:14.512237 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:14.512149 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:14.636024 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:14.635988 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"]
Apr 17 17:46:14.639469 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:46:14.639438 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b824ed2_0ab5_48d9_97ed_324eb2d248b3.slice/crio-e6777322dba7c02b19720760ab074ea8de4ac1f9c6be29fafcd375c5bbdb5a4e WatchSource:0}: Error finding container e6777322dba7c02b19720760ab074ea8de4ac1f9c6be29fafcd375c5bbdb5a4e: Status 404 returned error can't find the container with id e6777322dba7c02b19720760ab074ea8de4ac1f9c6be29fafcd375c5bbdb5a4e
Apr 17 17:46:15.262757 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:15.262719 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7" event={"ID":"5b824ed2-0ab5-48d9-97ed-324eb2d248b3","Type":"ContainerStarted","Data":"b2cff7f7d155f77ed4d6ae522de9c5869d6f2c5f752ecf3f6afd75081ae9f0c8"}
Apr 17 17:46:15.262757 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:15.262757 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7" event={"ID":"5b824ed2-0ab5-48d9-97ed-324eb2d248b3","Type":"ContainerStarted","Data":"e6777322dba7c02b19720760ab074ea8de4ac1f9c6be29fafcd375c5bbdb5a4e"}
Apr 17 17:46:15.954128 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:15.954087 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3020e928-41dd-41c1-9cbb-3102ca37799a" path="/var/lib/kubelet/pods/3020e928-41dd-41c1-9cbb-3102ca37799a/volumes"
Apr 17 17:46:15.958595 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:15.958569 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.40:8643/healthz\": dial tcp 10.134.0.40:8643: connect: connection refused"
Apr 17 17:46:17.959231 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:17.959203 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q"
Apr 17 17:46:18.021559 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.021478 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-proxy-tls\") pod \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") "
Apr 17 17:46:18.021703 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.021568 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-kserve-provision-location\") pod \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") "
Apr 17 17:46:18.021703 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.021610 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5tch\" (UniqueName: \"kubernetes.io/projected/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-kube-api-access-h5tch\") pod \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") "
Apr 17 17:46:18.021810 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.021787 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-48f392-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-isvc-primary-48f392-kube-rbac-proxy-sar-config\") pod \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\" (UID: \"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17\") "
Apr 17 17:46:18.021906 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.021870 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" (UID: "c82f1848-d93e-4e3c-b862-eeb4bc7e6f17"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:46:18.022114 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.022096 2546 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-kserve-provision-location\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:46:18.022175 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.022120 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-isvc-primary-48f392-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-48f392-kube-rbac-proxy-sar-config") pod "c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" (UID: "c82f1848-d93e-4e3c-b862-eeb4bc7e6f17"). InnerVolumeSpecName "isvc-primary-48f392-kube-rbac-proxy-sar-config".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:46:18.023542 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.023517 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" (UID: "c82f1848-d93e-4e3c-b862-eeb4bc7e6f17"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:46:18.024075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.024057 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-kube-api-access-h5tch" (OuterVolumeSpecName: "kube-api-access-h5tch") pod "c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" (UID: "c82f1848-d93e-4e3c-b862-eeb4bc7e6f17"). InnerVolumeSpecName "kube-api-access-h5tch". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:46:18.122991 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.122961 2546 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-48f392-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-isvc-primary-48f392-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:46:18.122991 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.122986 2546 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-proxy-tls\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:46:18.122991 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.122997 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h5tch\" (UniqueName: \"kubernetes.io/projected/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17-kube-api-access-h5tch\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:46:18.273695 
ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.273595 2546 generic.go:358] "Generic (PLEG): container finished" podID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerID="b5c0ff2088ad71d5c31f021aee37b70fbeb08f46c51babc7e11a416b8eeeeeb4" exitCode=0 Apr 17 17:46:18.273695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.273662 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" event={"ID":"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17","Type":"ContainerDied","Data":"b5c0ff2088ad71d5c31f021aee37b70fbeb08f46c51babc7e11a416b8eeeeeb4"} Apr 17 17:46:18.273695 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.273669 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" Apr 17 17:46:18.273947 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.273714 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q" event={"ID":"c82f1848-d93e-4e3c-b862-eeb4bc7e6f17","Type":"ContainerDied","Data":"aaeaf2a4d9ae3254b9a7ec767ccbdadd28fc6480bed6383a8b91edba7ceb225e"} Apr 17 17:46:18.273947 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.273731 2546 scope.go:117] "RemoveContainer" containerID="70300dac94f85b79c1c158538d2c692dce3c0862e42c662e5474521052368873" Apr 17 17:46:18.282451 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.282432 2546 scope.go:117] "RemoveContainer" containerID="b5c0ff2088ad71d5c31f021aee37b70fbeb08f46c51babc7e11a416b8eeeeeb4" Apr 17 17:46:18.289492 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.289470 2546 scope.go:117] "RemoveContainer" containerID="b30763453f183176c84210383718923bc7bea46bc73683f0d67c1d693a2210c8" Apr 17 17:46:18.295125 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.295093 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q"] Apr 
17 17:46:18.297398 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.297346 2546 scope.go:117] "RemoveContainer" containerID="70300dac94f85b79c1c158538d2c692dce3c0862e42c662e5474521052368873" Apr 17 17:46:18.297813 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:46:18.297786 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70300dac94f85b79c1c158538d2c692dce3c0862e42c662e5474521052368873\": container with ID starting with 70300dac94f85b79c1c158538d2c692dce3c0862e42c662e5474521052368873 not found: ID does not exist" containerID="70300dac94f85b79c1c158538d2c692dce3c0862e42c662e5474521052368873" Apr 17 17:46:18.297903 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.297841 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70300dac94f85b79c1c158538d2c692dce3c0862e42c662e5474521052368873"} err="failed to get container status \"70300dac94f85b79c1c158538d2c692dce3c0862e42c662e5474521052368873\": rpc error: code = NotFound desc = could not find container \"70300dac94f85b79c1c158538d2c692dce3c0862e42c662e5474521052368873\": container with ID starting with 70300dac94f85b79c1c158538d2c692dce3c0862e42c662e5474521052368873 not found: ID does not exist" Apr 17 17:46:18.297903 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.297868 2546 scope.go:117] "RemoveContainer" containerID="b5c0ff2088ad71d5c31f021aee37b70fbeb08f46c51babc7e11a416b8eeeeeb4" Apr 17 17:46:18.298191 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:46:18.298166 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5c0ff2088ad71d5c31f021aee37b70fbeb08f46c51babc7e11a416b8eeeeeb4\": container with ID starting with b5c0ff2088ad71d5c31f021aee37b70fbeb08f46c51babc7e11a416b8eeeeeb4 not found: ID does not exist" containerID="b5c0ff2088ad71d5c31f021aee37b70fbeb08f46c51babc7e11a416b8eeeeeb4" Apr 17 17:46:18.298255 
ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.298197 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c0ff2088ad71d5c31f021aee37b70fbeb08f46c51babc7e11a416b8eeeeeb4"} err="failed to get container status \"b5c0ff2088ad71d5c31f021aee37b70fbeb08f46c51babc7e11a416b8eeeeeb4\": rpc error: code = NotFound desc = could not find container \"b5c0ff2088ad71d5c31f021aee37b70fbeb08f46c51babc7e11a416b8eeeeeb4\": container with ID starting with b5c0ff2088ad71d5c31f021aee37b70fbeb08f46c51babc7e11a416b8eeeeeb4 not found: ID does not exist" Apr 17 17:46:18.298255 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.298215 2546 scope.go:117] "RemoveContainer" containerID="b30763453f183176c84210383718923bc7bea46bc73683f0d67c1d693a2210c8" Apr 17 17:46:18.298455 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:46:18.298437 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b30763453f183176c84210383718923bc7bea46bc73683f0d67c1d693a2210c8\": container with ID starting with b30763453f183176c84210383718923bc7bea46bc73683f0d67c1d693a2210c8 not found: ID does not exist" containerID="b30763453f183176c84210383718923bc7bea46bc73683f0d67c1d693a2210c8" Apr 17 17:46:18.298514 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.298459 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b30763453f183176c84210383718923bc7bea46bc73683f0d67c1d693a2210c8"} err="failed to get container status \"b30763453f183176c84210383718923bc7bea46bc73683f0d67c1d693a2210c8\": rpc error: code = NotFound desc = could not find container \"b30763453f183176c84210383718923bc7bea46bc73683f0d67c1d693a2210c8\": container with ID starting with b30763453f183176c84210383718923bc7bea46bc73683f0d67c1d693a2210c8 not found: ID does not exist" Apr 17 17:46:18.299581 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:18.299557 2546 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["kserve-ci-e2e-test/isvc-primary-48f392-predictor-bccb6d7b8-9qh8q"] Apr 17 17:46:19.953693 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:19.953651 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" path="/var/lib/kubelet/pods/c82f1848-d93e-4e3c-b862-eeb4bc7e6f17/volumes" Apr 17 17:46:20.282383 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:20.282305 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7_5b824ed2-0ab5-48d9-97ed-324eb2d248b3/storage-initializer/0.log" Apr 17 17:46:20.282383 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:20.282341 2546 generic.go:358] "Generic (PLEG): container finished" podID="5b824ed2-0ab5-48d9-97ed-324eb2d248b3" containerID="b2cff7f7d155f77ed4d6ae522de9c5869d6f2c5f752ecf3f6afd75081ae9f0c8" exitCode=1 Apr 17 17:46:20.282553 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:20.282422 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7" event={"ID":"5b824ed2-0ab5-48d9-97ed-324eb2d248b3","Type":"ContainerDied","Data":"b2cff7f7d155f77ed4d6ae522de9c5869d6f2c5f752ecf3f6afd75081ae9f0c8"} Apr 17 17:46:21.288399 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:21.288376 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7_5b824ed2-0ab5-48d9-97ed-324eb2d248b3/storage-initializer/0.log" Apr 17 17:46:21.288867 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:21.288442 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7" event={"ID":"5b824ed2-0ab5-48d9-97ed-324eb2d248b3","Type":"ContainerStarted","Data":"69195c38452664497c58d83ef89dfb71c364ad77833c061989f61a599532b13c"} Apr 17 17:46:23.631313 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.631273 2546 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"] Apr 17 17:46:23.631768 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.631609 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7" podUID="5b824ed2-0ab5-48d9-97ed-324eb2d248b3" containerName="storage-initializer" containerID="cri-o://69195c38452664497c58d83ef89dfb71c364ad77833c061989f61a599532b13c" gracePeriod=30 Apr 17 17:46:23.740431 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.740354 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"] Apr 17 17:46:23.740733 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.740720 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3020e928-41dd-41c1-9cbb-3102ca37799a" containerName="storage-initializer" Apr 17 17:46:23.740786 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.740735 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3020e928-41dd-41c1-9cbb-3102ca37799a" containerName="storage-initializer" Apr 17 17:46:23.740786 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.740750 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="storage-initializer" Apr 17 17:46:23.740786 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.740755 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="storage-initializer" Apr 17 17:46:23.740786 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.740766 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kserve-container" Apr 17 17:46:23.740786 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.740772 2546 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kserve-container" Apr 17 17:46:23.740786 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.740782 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3020e928-41dd-41c1-9cbb-3102ca37799a" containerName="storage-initializer" Apr 17 17:46:23.740786 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.740787 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3020e928-41dd-41c1-9cbb-3102ca37799a" containerName="storage-initializer" Apr 17 17:46:23.740991 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.740794 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kube-rbac-proxy" Apr 17 17:46:23.740991 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.740799 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kube-rbac-proxy" Apr 17 17:46:23.740991 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.740858 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="3020e928-41dd-41c1-9cbb-3102ca37799a" containerName="storage-initializer" Apr 17 17:46:23.740991 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.740866 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kube-rbac-proxy" Apr 17 17:46:23.740991 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.740875 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="c82f1848-d93e-4e3c-b862-eeb4bc7e6f17" containerName="kserve-container" Apr 17 17:46:23.740991 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.740882 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="3020e928-41dd-41c1-9cbb-3102ca37799a" containerName="storage-initializer" Apr 17 17:46:23.744467 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.744439 2546 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" Apr 17 17:46:23.746762 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.746703 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-b1f8f-kube-rbac-proxy-sar-config\"" Apr 17 17:46:23.746762 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.746751 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-b1f8f-predictor-serving-cert\"" Apr 17 17:46:23.746996 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.746799 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gt7v2\"" Apr 17 17:46:23.754199 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.754176 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"] Apr 17 17:46:23.773254 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.773231 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-b1f8f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/495db916-37c3-4257-ac7e-7d86b6638fcc-raw-sklearn-b1f8f-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") " pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" Apr 17 17:46:23.773372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.773291 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzbsk\" (UniqueName: \"kubernetes.io/projected/495db916-37c3-4257-ac7e-7d86b6638fcc-kube-api-access-qzbsk\") pod \"raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") " 
pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" Apr 17 17:46:23.773372 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.773320 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/495db916-37c3-4257-ac7e-7d86b6638fcc-kserve-provision-location\") pod \"raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") " pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" Apr 17 17:46:23.773488 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.773378 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/495db916-37c3-4257-ac7e-7d86b6638fcc-proxy-tls\") pod \"raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") " pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" Apr 17 17:46:23.773990 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.773975 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7_5b824ed2-0ab5-48d9-97ed-324eb2d248b3/storage-initializer/1.log" Apr 17 17:46:23.774376 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.774356 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7_5b824ed2-0ab5-48d9-97ed-324eb2d248b3/storage-initializer/0.log" Apr 17 17:46:23.774458 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.774434 2546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7" Apr 17 17:46:23.873820 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.873786 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-proxy-tls\") pod \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " Apr 17 17:46:23.873967 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.873829 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-kserve-provision-location\") pod \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " Apr 17 17:46:23.873967 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.873894 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-cabundle-cert\") pod \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " Apr 17 17:46:23.873967 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.873912 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-f4bb2d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-isvc-init-fail-f4bb2d-kube-rbac-proxy-sar-config\") pod \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\" (UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " Apr 17 17:46:23.873967 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.873929 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsw7v\" (UniqueName: \"kubernetes.io/projected/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-kube-api-access-gsw7v\") pod \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\" 
(UID: \"5b824ed2-0ab5-48d9-97ed-324eb2d248b3\") " Apr 17 17:46:23.874159 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.874085 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-b1f8f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/495db916-37c3-4257-ac7e-7d86b6638fcc-raw-sklearn-b1f8f-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") " pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" Apr 17 17:46:23.874159 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.874127 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzbsk\" (UniqueName: \"kubernetes.io/projected/495db916-37c3-4257-ac7e-7d86b6638fcc-kube-api-access-qzbsk\") pod \"raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") " pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" Apr 17 17:46:23.874159 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.874155 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/495db916-37c3-4257-ac7e-7d86b6638fcc-kserve-provision-location\") pod \"raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") " pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" Apr 17 17:46:23.874316 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.874201 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/495db916-37c3-4257-ac7e-7d86b6638fcc-proxy-tls\") pod \"raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") " pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" Apr 17 17:46:23.874316 ip-10-0-137-46 kubenswrapper[2546]: I0417 
17:46:23.874200 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5b824ed2-0ab5-48d9-97ed-324eb2d248b3" (UID: "5b824ed2-0ab5-48d9-97ed-324eb2d248b3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:46:23.874316 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.874274 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "5b824ed2-0ab5-48d9-97ed-324eb2d248b3" (UID: "5b824ed2-0ab5-48d9-97ed-324eb2d248b3"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:46:23.874316 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.874301 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-isvc-init-fail-f4bb2d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-f4bb2d-kube-rbac-proxy-sar-config") pod "5b824ed2-0ab5-48d9-97ed-324eb2d248b3" (UID: "5b824ed2-0ab5-48d9-97ed-324eb2d248b3"). InnerVolumeSpecName "isvc-init-fail-f4bb2d-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:46:23.874529 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.874370 2546 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-cabundle-cert\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:46:23.874529 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.874390 2546 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-f4bb2d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-isvc-init-fail-f4bb2d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:46:23.874529 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:46:23.874395 2546 secret.go:189] Couldn't get secret kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-serving-cert: secret "raw-sklearn-b1f8f-predictor-serving-cert" not found Apr 17 17:46:23.874529 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.874405 2546 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-kserve-provision-location\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:46:23.874529 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:46:23.874452 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495db916-37c3-4257-ac7e-7d86b6638fcc-proxy-tls podName:495db916-37c3-4257-ac7e-7d86b6638fcc nodeName:}" failed. No retries permitted until 2026-04-17 17:46:24.374431858 +0000 UTC m=+1276.996712523 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/495db916-37c3-4257-ac7e-7d86b6638fcc-proxy-tls") pod "raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" (UID: "495db916-37c3-4257-ac7e-7d86b6638fcc") : secret "raw-sklearn-b1f8f-predictor-serving-cert" not found
Apr 17 17:46:23.874824 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.874592 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/495db916-37c3-4257-ac7e-7d86b6638fcc-kserve-provision-location\") pod \"raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") " pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"
Apr 17 17:46:23.874910 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.874883 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-b1f8f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/495db916-37c3-4257-ac7e-7d86b6638fcc-raw-sklearn-b1f8f-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") " pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"
Apr 17 17:46:23.876217 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.876198 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-kube-api-access-gsw7v" (OuterVolumeSpecName: "kube-api-access-gsw7v") pod "5b824ed2-0ab5-48d9-97ed-324eb2d248b3" (UID: "5b824ed2-0ab5-48d9-97ed-324eb2d248b3"). InnerVolumeSpecName "kube-api-access-gsw7v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:46:23.876273 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.876223 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5b824ed2-0ab5-48d9-97ed-324eb2d248b3" (UID: "5b824ed2-0ab5-48d9-97ed-324eb2d248b3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:46:23.883000 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.882979 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzbsk\" (UniqueName: \"kubernetes.io/projected/495db916-37c3-4257-ac7e-7d86b6638fcc-kube-api-access-qzbsk\") pod \"raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") " pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"
Apr 17 17:46:23.974903 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.974875 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gsw7v\" (UniqueName: \"kubernetes.io/projected/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-kube-api-access-gsw7v\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:46:23.974903 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:23.974900 2546 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b824ed2-0ab5-48d9-97ed-324eb2d248b3-proxy-tls\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:46:24.299349 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.299322 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7_5b824ed2-0ab5-48d9-97ed-324eb2d248b3/storage-initializer/1.log"
Apr 17 17:46:24.299703 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.299667 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7_5b824ed2-0ab5-48d9-97ed-324eb2d248b3/storage-initializer/0.log"
Apr 17 17:46:24.299767 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.299724 2546 generic.go:358] "Generic (PLEG): container finished" podID="5b824ed2-0ab5-48d9-97ed-324eb2d248b3" containerID="69195c38452664497c58d83ef89dfb71c364ad77833c061989f61a599532b13c" exitCode=1
Apr 17 17:46:24.299804 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.299757 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7" event={"ID":"5b824ed2-0ab5-48d9-97ed-324eb2d248b3","Type":"ContainerDied","Data":"69195c38452664497c58d83ef89dfb71c364ad77833c061989f61a599532b13c"}
Apr 17 17:46:24.299804 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.299793 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7" event={"ID":"5b824ed2-0ab5-48d9-97ed-324eb2d248b3","Type":"ContainerDied","Data":"e6777322dba7c02b19720760ab074ea8de4ac1f9c6be29fafcd375c5bbdb5a4e"}
Apr 17 17:46:24.299877 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.299810 2546 scope.go:117] "RemoveContainer" containerID="69195c38452664497c58d83ef89dfb71c364ad77833c061989f61a599532b13c"
Apr 17 17:46:24.299877 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.299834 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"
Apr 17 17:46:24.310644 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.310620 2546 scope.go:117] "RemoveContainer" containerID="b2cff7f7d155f77ed4d6ae522de9c5869d6f2c5f752ecf3f6afd75081ae9f0c8"
Apr 17 17:46:24.318168 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.318147 2546 scope.go:117] "RemoveContainer" containerID="69195c38452664497c58d83ef89dfb71c364ad77833c061989f61a599532b13c"
Apr 17 17:46:24.318439 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:46:24.318419 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69195c38452664497c58d83ef89dfb71c364ad77833c061989f61a599532b13c\": container with ID starting with 69195c38452664497c58d83ef89dfb71c364ad77833c061989f61a599532b13c not found: ID does not exist" containerID="69195c38452664497c58d83ef89dfb71c364ad77833c061989f61a599532b13c"
Apr 17 17:46:24.318482 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.318449 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69195c38452664497c58d83ef89dfb71c364ad77833c061989f61a599532b13c"} err="failed to get container status \"69195c38452664497c58d83ef89dfb71c364ad77833c061989f61a599532b13c\": rpc error: code = NotFound desc = could not find container \"69195c38452664497c58d83ef89dfb71c364ad77833c061989f61a599532b13c\": container with ID starting with 69195c38452664497c58d83ef89dfb71c364ad77833c061989f61a599532b13c not found: ID does not exist"
Apr 17 17:46:24.318482 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.318468 2546 scope.go:117] "RemoveContainer" containerID="b2cff7f7d155f77ed4d6ae522de9c5869d6f2c5f752ecf3f6afd75081ae9f0c8"
Apr 17 17:46:24.318727 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:46:24.318705 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2cff7f7d155f77ed4d6ae522de9c5869d6f2c5f752ecf3f6afd75081ae9f0c8\": container with ID starting with b2cff7f7d155f77ed4d6ae522de9c5869d6f2c5f752ecf3f6afd75081ae9f0c8 not found: ID does not exist" containerID="b2cff7f7d155f77ed4d6ae522de9c5869d6f2c5f752ecf3f6afd75081ae9f0c8"
Apr 17 17:46:24.318782 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.318735 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2cff7f7d155f77ed4d6ae522de9c5869d6f2c5f752ecf3f6afd75081ae9f0c8"} err="failed to get container status \"b2cff7f7d155f77ed4d6ae522de9c5869d6f2c5f752ecf3f6afd75081ae9f0c8\": rpc error: code = NotFound desc = could not find container \"b2cff7f7d155f77ed4d6ae522de9c5869d6f2c5f752ecf3f6afd75081ae9f0c8\": container with ID starting with b2cff7f7d155f77ed4d6ae522de9c5869d6f2c5f752ecf3f6afd75081ae9f0c8 not found: ID does not exist"
Apr 17 17:46:24.334935 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.334907 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"]
Apr 17 17:46:24.342469 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.342440 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-f4bb2d-predictor-57fbff75d7-27gk7"]
Apr 17 17:46:24.378160 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.378134 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/495db916-37c3-4257-ac7e-7d86b6638fcc-proxy-tls\") pod \"raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") " pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"
Apr 17 17:46:24.380642 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.380618 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/495db916-37c3-4257-ac7e-7d86b6638fcc-proxy-tls\") pod \"raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") " pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"
Apr 17 17:46:24.657327 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.657240 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"
Apr 17 17:46:24.777726 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:24.777634 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"]
Apr 17 17:46:24.780639 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:46:24.780609 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod495db916_37c3_4257_ac7e_7d86b6638fcc.slice/crio-dd65221a2392673eb2fb4cf0f755f6894f579cfb7cef231dfad412b59a675cbf WatchSource:0}: Error finding container dd65221a2392673eb2fb4cf0f755f6894f579cfb7cef231dfad412b59a675cbf: Status 404 returned error can't find the container with id dd65221a2392673eb2fb4cf0f755f6894f579cfb7cef231dfad412b59a675cbf
Apr 17 17:46:25.305080 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:25.305045 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" event={"ID":"495db916-37c3-4257-ac7e-7d86b6638fcc","Type":"ContainerStarted","Data":"03913b4106806e65cb2c6ad7910f78dde6d8052f3743fe464bcc6a888dfaacd4"}
Apr 17 17:46:25.305080 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:25.305082 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" event={"ID":"495db916-37c3-4257-ac7e-7d86b6638fcc","Type":"ContainerStarted","Data":"dd65221a2392673eb2fb4cf0f755f6894f579cfb7cef231dfad412b59a675cbf"}
Apr 17 17:46:25.954593 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:25.954559 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b824ed2-0ab5-48d9-97ed-324eb2d248b3" path="/var/lib/kubelet/pods/5b824ed2-0ab5-48d9-97ed-324eb2d248b3/volumes"
Apr 17 17:46:29.320186 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:29.320149 2546 generic.go:358] "Generic (PLEG): container finished" podID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerID="03913b4106806e65cb2c6ad7910f78dde6d8052f3743fe464bcc6a888dfaacd4" exitCode=0
Apr 17 17:46:29.320569 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:29.320224 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" event={"ID":"495db916-37c3-4257-ac7e-7d86b6638fcc","Type":"ContainerDied","Data":"03913b4106806e65cb2c6ad7910f78dde6d8052f3743fe464bcc6a888dfaacd4"}
Apr 17 17:46:30.325169 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:30.325135 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" event={"ID":"495db916-37c3-4257-ac7e-7d86b6638fcc","Type":"ContainerStarted","Data":"675952fa6273401f84e61dff465cb7ee260292fdb4fc0678707ba1c062ced093"}
Apr 17 17:46:30.325560 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:30.325178 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" event={"ID":"495db916-37c3-4257-ac7e-7d86b6638fcc","Type":"ContainerStarted","Data":"51ff2363c065aa40ef9becc43f375cc96e03bc09fb8290774c8b5f8f4dc0aef7"}
Apr 17 17:46:30.325560 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:30.325375 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"
Apr 17 17:46:30.345081 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:30.345019 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" podStartSLOduration=7.344982096 podStartE2EDuration="7.344982096s" podCreationTimestamp="2026-04-17 17:46:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:46:30.343138196 +0000 UTC m=+1282.965418897" watchObservedRunningTime="2026-04-17 17:46:30.344982096 +0000 UTC m=+1282.967262771"
Apr 17 17:46:31.328920 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:31.328878 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"
Apr 17 17:46:31.330071 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:31.330041 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 17 17:46:32.332324 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:32.332285 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 17 17:46:37.336560 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:37.336525 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"
Apr 17 17:46:37.337070 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:37.337036 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 17 17:46:47.337334 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:47.337292 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 17 17:46:57.337611 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:46:57.337570 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 17 17:47:07.337176 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:07.337132 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 17 17:47:17.337333 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:17.337294 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 17 17:47:27.337149 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:27.337106 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 17 17:47:37.337859 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:37.337828 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"
Apr 17 17:47:43.828899 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:43.828840 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"]
Apr 17 17:47:43.829876 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:43.829818 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kserve-container" containerID="cri-o://51ff2363c065aa40ef9becc43f375cc96e03bc09fb8290774c8b5f8f4dc0aef7" gracePeriod=30
Apr 17 17:47:43.830110 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:43.829914 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kube-rbac-proxy" containerID="cri-o://675952fa6273401f84e61dff465cb7ee260292fdb4fc0678707ba1c062ced093" gracePeriod=30
Apr 17 17:47:43.900393 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:43.900356 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"]
Apr 17 17:47:43.900815 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:43.900797 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b824ed2-0ab5-48d9-97ed-324eb2d248b3" containerName="storage-initializer"
Apr 17 17:47:43.900815 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:43.900816 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b824ed2-0ab5-48d9-97ed-324eb2d248b3" containerName="storage-initializer"
Apr 17 17:47:43.900976 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:43.900851 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b824ed2-0ab5-48d9-97ed-324eb2d248b3" containerName="storage-initializer"
Apr 17 17:47:43.900976 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:43.900860 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b824ed2-0ab5-48d9-97ed-324eb2d248b3" containerName="storage-initializer"
Apr 17 17:47:43.900976 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:43.900945 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b824ed2-0ab5-48d9-97ed-324eb2d248b3" containerName="storage-initializer"
Apr 17 17:47:43.901134 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:43.901117 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b824ed2-0ab5-48d9-97ed-324eb2d248b3" containerName="storage-initializer"
Apr 17 17:47:43.904430 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:43.904408 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"
Apr 17 17:47:43.906576 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:43.906557 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-f3b83-predictor-serving-cert\""
Apr 17 17:47:43.906715 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:43.906582 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-f3b83-kube-rbac-proxy-sar-config\""
Apr 17 17:47:43.914828 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:43.914806 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"]
Apr 17 17:47:44.077374 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.077337 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-runtime-f3b83-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ec8e92f8-bb79-4bba-8450-3dfedc04304e-raw-sklearn-runtime-f3b83-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck\" (UID: \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"
Apr 17 17:47:44.077374 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.077379 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsw9c\" (UniqueName: \"kubernetes.io/projected/ec8e92f8-bb79-4bba-8450-3dfedc04304e-kube-api-access-qsw9c\") pod \"raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck\" (UID: \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"
Apr 17 17:47:44.077599 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.077463 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec8e92f8-bb79-4bba-8450-3dfedc04304e-kserve-provision-location\") pod \"raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck\" (UID: \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"
Apr 17 17:47:44.077599 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.077506 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec8e92f8-bb79-4bba-8450-3dfedc04304e-proxy-tls\") pod \"raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck\" (UID: \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"
Apr 17 17:47:44.178637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.178536 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsw9c\" (UniqueName: \"kubernetes.io/projected/ec8e92f8-bb79-4bba-8450-3dfedc04304e-kube-api-access-qsw9c\") pod \"raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck\" (UID: \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"
Apr 17 17:47:44.178637 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.178607 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec8e92f8-bb79-4bba-8450-3dfedc04304e-kserve-provision-location\") pod \"raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck\" (UID: \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"
Apr 17 17:47:44.178936 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.178640 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec8e92f8-bb79-4bba-8450-3dfedc04304e-proxy-tls\") pod \"raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck\" (UID: \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"
Apr 17 17:47:44.178936 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.178710 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-runtime-f3b83-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ec8e92f8-bb79-4bba-8450-3dfedc04304e-raw-sklearn-runtime-f3b83-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck\" (UID: \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"
Apr 17 17:47:44.179056 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.179032 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec8e92f8-bb79-4bba-8450-3dfedc04304e-kserve-provision-location\") pod \"raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck\" (UID: \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"
Apr 17 17:47:44.179393 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.179375 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-runtime-f3b83-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ec8e92f8-bb79-4bba-8450-3dfedc04304e-raw-sklearn-runtime-f3b83-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck\" (UID: \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"
Apr 17 17:47:44.181066 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.181038 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec8e92f8-bb79-4bba-8450-3dfedc04304e-proxy-tls\") pod \"raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck\" (UID: \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"
Apr 17 17:47:44.187928 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.187899 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsw9c\" (UniqueName: \"kubernetes.io/projected/ec8e92f8-bb79-4bba-8450-3dfedc04304e-kube-api-access-qsw9c\") pod \"raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck\" (UID: \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"
Apr 17 17:47:44.215851 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.215818 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"
Apr 17 17:47:44.339345 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.339313 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"]
Apr 17 17:47:44.340805 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:47:44.340777 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec8e92f8_bb79_4bba_8450_3dfedc04304e.slice/crio-b54cf03669dec34fbbe3047eca455614ae61412615fdbd84c79dc53603240db1 WatchSource:0}: Error finding container b54cf03669dec34fbbe3047eca455614ae61412615fdbd84c79dc53603240db1: Status 404 returned error can't find the container with id b54cf03669dec34fbbe3047eca455614ae61412615fdbd84c79dc53603240db1
Apr 17 17:47:44.594230 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.594192 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" event={"ID":"ec8e92f8-bb79-4bba-8450-3dfedc04304e","Type":"ContainerStarted","Data":"a377e1a9ba0a90d75f04a83273a97aa4ac621ab83ec82291cdba5f616bd5e2df"}
Apr 17 17:47:44.594230 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.594235 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" event={"ID":"ec8e92f8-bb79-4bba-8450-3dfedc04304e","Type":"ContainerStarted","Data":"b54cf03669dec34fbbe3047eca455614ae61412615fdbd84c79dc53603240db1"}
Apr 17 17:47:44.596157 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.596135 2546 generic.go:358] "Generic (PLEG): container finished" podID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerID="675952fa6273401f84e61dff465cb7ee260292fdb4fc0678707ba1c062ced093" exitCode=2
Apr 17 17:47:44.596268 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:44.596170 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" event={"ID":"495db916-37c3-4257-ac7e-7d86b6638fcc","Type":"ContainerDied","Data":"675952fa6273401f84e61dff465cb7ee260292fdb4fc0678707ba1c062ced093"}
Apr 17 17:47:47.332734 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:47.332671 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.43:8643/healthz\": dial tcp 10.134.0.43:8643: connect: connection refused"
Apr 17 17:47:47.336979 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:47.336949 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 17 17:47:48.073754 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.073726 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"
Apr 17 17:47:48.210645 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.210546 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-b1f8f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/495db916-37c3-4257-ac7e-7d86b6638fcc-raw-sklearn-b1f8f-kube-rbac-proxy-sar-config\") pod \"495db916-37c3-4257-ac7e-7d86b6638fcc\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") "
Apr 17 17:47:48.210645 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.210628 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/495db916-37c3-4257-ac7e-7d86b6638fcc-proxy-tls\") pod \"495db916-37c3-4257-ac7e-7d86b6638fcc\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") "
Apr 17 17:47:48.210902 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.210710 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/495db916-37c3-4257-ac7e-7d86b6638fcc-kserve-provision-location\") pod \"495db916-37c3-4257-ac7e-7d86b6638fcc\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") "
Apr 17 17:47:48.210902 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.210754 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzbsk\" (UniqueName: \"kubernetes.io/projected/495db916-37c3-4257-ac7e-7d86b6638fcc-kube-api-access-qzbsk\") pod \"495db916-37c3-4257-ac7e-7d86b6638fcc\" (UID: \"495db916-37c3-4257-ac7e-7d86b6638fcc\") "
Apr 17 17:47:48.211030 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.211000 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/495db916-37c3-4257-ac7e-7d86b6638fcc-raw-sklearn-b1f8f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-b1f8f-kube-rbac-proxy-sar-config") pod "495db916-37c3-4257-ac7e-7d86b6638fcc" (UID: "495db916-37c3-4257-ac7e-7d86b6638fcc"). InnerVolumeSpecName "raw-sklearn-b1f8f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:47:48.211088 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.211057 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/495db916-37c3-4257-ac7e-7d86b6638fcc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "495db916-37c3-4257-ac7e-7d86b6638fcc" (UID: "495db916-37c3-4257-ac7e-7d86b6638fcc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:47:48.212811 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.212779 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/495db916-37c3-4257-ac7e-7d86b6638fcc-kube-api-access-qzbsk" (OuterVolumeSpecName: "kube-api-access-qzbsk") pod "495db916-37c3-4257-ac7e-7d86b6638fcc" (UID: "495db916-37c3-4257-ac7e-7d86b6638fcc"). InnerVolumeSpecName "kube-api-access-qzbsk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:47:48.212923 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.212809 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/495db916-37c3-4257-ac7e-7d86b6638fcc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "495db916-37c3-4257-ac7e-7d86b6638fcc" (UID: "495db916-37c3-4257-ac7e-7d86b6638fcc"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:47:48.311773 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.311745 2546 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-b1f8f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/495db916-37c3-4257-ac7e-7d86b6638fcc-raw-sklearn-b1f8f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:47:48.311873 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.311775 2546 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/495db916-37c3-4257-ac7e-7d86b6638fcc-proxy-tls\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:47:48.311873 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.311788 2546 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/495db916-37c3-4257-ac7e-7d86b6638fcc-kserve-provision-location\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:47:48.311873 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.311797 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qzbsk\" (UniqueName: \"kubernetes.io/projected/495db916-37c3-4257-ac7e-7d86b6638fcc-kube-api-access-qzbsk\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\""
Apr 17 17:47:48.610822 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.610785 2546 generic.go:358] "Generic (PLEG): container finished" podID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerID="a377e1a9ba0a90d75f04a83273a97aa4ac621ab83ec82291cdba5f616bd5e2df" exitCode=0
Apr 17 17:47:48.611223 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.610858 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" event={"ID":"ec8e92f8-bb79-4bba-8450-3dfedc04304e","Type":"ContainerDied","Data":"a377e1a9ba0a90d75f04a83273a97aa4ac621ab83ec82291cdba5f616bd5e2df"}
Apr 17 17:47:48.612595 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.612571 2546 generic.go:358] "Generic (PLEG): container finished" podID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerID="51ff2363c065aa40ef9becc43f375cc96e03bc09fb8290774c8b5f8f4dc0aef7" exitCode=0
Apr 17 17:47:48.612719 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.612637 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" event={"ID":"495db916-37c3-4257-ac7e-7d86b6638fcc","Type":"ContainerDied","Data":"51ff2363c065aa40ef9becc43f375cc96e03bc09fb8290774c8b5f8f4dc0aef7"}
Apr 17 17:47:48.612719 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.612666 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j" event={"ID":"495db916-37c3-4257-ac7e-7d86b6638fcc","Type":"ContainerDied","Data":"dd65221a2392673eb2fb4cf0f755f6894f579cfb7cef231dfad412b59a675cbf"}
Apr 17 17:47:48.612719 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.612700 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"
Apr 17 17:47:48.612719 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.612712 2546 scope.go:117] "RemoveContainer" containerID="675952fa6273401f84e61dff465cb7ee260292fdb4fc0678707ba1c062ced093"
Apr 17 17:47:48.621156 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.621139 2546 scope.go:117] "RemoveContainer" containerID="51ff2363c065aa40ef9becc43f375cc96e03bc09fb8290774c8b5f8f4dc0aef7"
Apr 17 17:47:48.628172 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.628144 2546 scope.go:117] "RemoveContainer" containerID="03913b4106806e65cb2c6ad7910f78dde6d8052f3743fe464bcc6a888dfaacd4"
Apr 17 17:47:48.637702 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.637667 2546 scope.go:117] "RemoveContainer" containerID="675952fa6273401f84e61dff465cb7ee260292fdb4fc0678707ba1c062ced093"
Apr 17 17:47:48.637945 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:47:48.637929 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"675952fa6273401f84e61dff465cb7ee260292fdb4fc0678707ba1c062ced093\": container with ID starting with 675952fa6273401f84e61dff465cb7ee260292fdb4fc0678707ba1c062ced093 not found: ID does not exist" containerID="675952fa6273401f84e61dff465cb7ee260292fdb4fc0678707ba1c062ced093"
Apr 17 17:47:48.637991 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.637952 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"675952fa6273401f84e61dff465cb7ee260292fdb4fc0678707ba1c062ced093"} err="failed to get container status \"675952fa6273401f84e61dff465cb7ee260292fdb4fc0678707ba1c062ced093\": rpc error: code = NotFound desc = could not find container \"675952fa6273401f84e61dff465cb7ee260292fdb4fc0678707ba1c062ced093\": container with ID starting with 675952fa6273401f84e61dff465cb7ee260292fdb4fc0678707ba1c062ced093 not found: ID does not exist"
Apr 17 17:47:48.637991
ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.637967 2546 scope.go:117] "RemoveContainer" containerID="51ff2363c065aa40ef9becc43f375cc96e03bc09fb8290774c8b5f8f4dc0aef7" Apr 17 17:47:48.638217 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:47:48.638198 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51ff2363c065aa40ef9becc43f375cc96e03bc09fb8290774c8b5f8f4dc0aef7\": container with ID starting with 51ff2363c065aa40ef9becc43f375cc96e03bc09fb8290774c8b5f8f4dc0aef7 not found: ID does not exist" containerID="51ff2363c065aa40ef9becc43f375cc96e03bc09fb8290774c8b5f8f4dc0aef7" Apr 17 17:47:48.638264 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.638225 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51ff2363c065aa40ef9becc43f375cc96e03bc09fb8290774c8b5f8f4dc0aef7"} err="failed to get container status \"51ff2363c065aa40ef9becc43f375cc96e03bc09fb8290774c8b5f8f4dc0aef7\": rpc error: code = NotFound desc = could not find container \"51ff2363c065aa40ef9becc43f375cc96e03bc09fb8290774c8b5f8f4dc0aef7\": container with ID starting with 51ff2363c065aa40ef9becc43f375cc96e03bc09fb8290774c8b5f8f4dc0aef7 not found: ID does not exist" Apr 17 17:47:48.638264 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.638242 2546 scope.go:117] "RemoveContainer" containerID="03913b4106806e65cb2c6ad7910f78dde6d8052f3743fe464bcc6a888dfaacd4" Apr 17 17:47:48.638476 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:47:48.638457 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03913b4106806e65cb2c6ad7910f78dde6d8052f3743fe464bcc6a888dfaacd4\": container with ID starting with 03913b4106806e65cb2c6ad7910f78dde6d8052f3743fe464bcc6a888dfaacd4 not found: ID does not exist" containerID="03913b4106806e65cb2c6ad7910f78dde6d8052f3743fe464bcc6a888dfaacd4" Apr 17 17:47:48.638529 ip-10-0-137-46 
kubenswrapper[2546]: I0417 17:47:48.638480 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03913b4106806e65cb2c6ad7910f78dde6d8052f3743fe464bcc6a888dfaacd4"} err="failed to get container status \"03913b4106806e65cb2c6ad7910f78dde6d8052f3743fe464bcc6a888dfaacd4\": rpc error: code = NotFound desc = could not find container \"03913b4106806e65cb2c6ad7910f78dde6d8052f3743fe464bcc6a888dfaacd4\": container with ID starting with 03913b4106806e65cb2c6ad7910f78dde6d8052f3743fe464bcc6a888dfaacd4 not found: ID does not exist" Apr 17 17:47:48.645924 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.645901 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"] Apr 17 17:47:48.650534 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:48.650511 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-b1f8f-predictor-5df5d48cb5-ls88j"] Apr 17 17:47:49.619075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:49.619037 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" event={"ID":"ec8e92f8-bb79-4bba-8450-3dfedc04304e","Type":"ContainerStarted","Data":"49280ff6d5d7475e3586da3be08c64597dc214de782aabda6cb06d65b6998cc1"} Apr 17 17:47:49.619075 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:49.619080 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" event={"ID":"ec8e92f8-bb79-4bba-8450-3dfedc04304e","Type":"ContainerStarted","Data":"3a13f17e3da88faa17607a13621d33a57bb28a21dc31fa33a7b4ba28f2d71b9c"} Apr 17 17:47:49.619663 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:49.619340 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" Apr 17 17:47:49.639797 ip-10-0-137-46 
kubenswrapper[2546]: I0417 17:47:49.639750 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" podStartSLOduration=6.639737515 podStartE2EDuration="6.639737515s" podCreationTimestamp="2026-04-17 17:47:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:47:49.638221357 +0000 UTC m=+1362.260502032" watchObservedRunningTime="2026-04-17 17:47:49.639737515 +0000 UTC m=+1362.262018189" Apr 17 17:47:49.954509 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:49.954428 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" path="/var/lib/kubelet/pods/495db916-37c3-4257-ac7e-7d86b6638fcc/volumes" Apr 17 17:47:50.623643 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:50.623613 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" Apr 17 17:47:50.624516 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:50.624492 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 17 17:47:51.626953 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:51.626910 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 17 17:47:56.631663 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:56.631634 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" Apr 17 17:47:56.632197 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:47:56.632171 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 17 17:48:06.632300 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:48:06.632257 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 17 17:48:16.632414 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:48:16.632372 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 17 17:48:26.632535 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:48:26.632488 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 17 17:48:36.632828 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:48:36.632785 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" 
Apr 17 17:48:46.632857 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:48:46.632817 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 17 17:48:56.633792 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:48:56.633721 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" Apr 17 17:49:04.014853 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:04.014820 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"] Apr 17 17:49:04.015233 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:04.015135 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kserve-container" containerID="cri-o://3a13f17e3da88faa17607a13621d33a57bb28a21dc31fa33a7b4ba28f2d71b9c" gracePeriod=30 Apr 17 17:49:04.015233 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:04.015168 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kube-rbac-proxy" containerID="cri-o://49280ff6d5d7475e3586da3be08c64597dc214de782aabda6cb06d65b6998cc1" gracePeriod=30 Apr 17 17:49:04.883657 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:04.883623 2546 generic.go:358] "Generic (PLEG): container finished" podID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerID="49280ff6d5d7475e3586da3be08c64597dc214de782aabda6cb06d65b6998cc1" exitCode=2 Apr 17 17:49:04.883836 ip-10-0-137-46 kubenswrapper[2546]: I0417 
17:49:04.883671 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" event={"ID":"ec8e92f8-bb79-4bba-8450-3dfedc04304e","Type":"ContainerDied","Data":"49280ff6d5d7475e3586da3be08c64597dc214de782aabda6cb06d65b6998cc1"} Apr 17 17:49:06.627767 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:06.627725 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.44:8643/healthz\": dial tcp 10.134.0.44:8643: connect: connection refused" Apr 17 17:49:06.633112 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:06.633090 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 17 17:49:08.358520 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.358493 2546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" Apr 17 17:49:08.415702 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.415611 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec8e92f8-bb79-4bba-8450-3dfedc04304e-proxy-tls\") pod \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\" (UID: \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\") " Apr 17 17:49:08.415702 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.415656 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsw9c\" (UniqueName: \"kubernetes.io/projected/ec8e92f8-bb79-4bba-8450-3dfedc04304e-kube-api-access-qsw9c\") pod \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\" (UID: \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\") " Apr 17 17:49:08.415905 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.415736 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-runtime-f3b83-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ec8e92f8-bb79-4bba-8450-3dfedc04304e-raw-sklearn-runtime-f3b83-kube-rbac-proxy-sar-config\") pod \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\" (UID: \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\") " Apr 17 17:49:08.415905 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.415780 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec8e92f8-bb79-4bba-8450-3dfedc04304e-kserve-provision-location\") pod \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\" (UID: \"ec8e92f8-bb79-4bba-8450-3dfedc04304e\") " Apr 17 17:49:08.416177 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.416150 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8e92f8-bb79-4bba-8450-3dfedc04304e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"ec8e92f8-bb79-4bba-8450-3dfedc04304e" (UID: "ec8e92f8-bb79-4bba-8450-3dfedc04304e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:49:08.416244 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.416153 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec8e92f8-bb79-4bba-8450-3dfedc04304e-raw-sklearn-runtime-f3b83-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-runtime-f3b83-kube-rbac-proxy-sar-config") pod "ec8e92f8-bb79-4bba-8450-3dfedc04304e" (UID: "ec8e92f8-bb79-4bba-8450-3dfedc04304e"). InnerVolumeSpecName "raw-sklearn-runtime-f3b83-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:49:08.417652 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.417627 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8e92f8-bb79-4bba-8450-3dfedc04304e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ec8e92f8-bb79-4bba-8450-3dfedc04304e" (UID: "ec8e92f8-bb79-4bba-8450-3dfedc04304e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:49:08.417886 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.417871 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec8e92f8-bb79-4bba-8450-3dfedc04304e-kube-api-access-qsw9c" (OuterVolumeSpecName: "kube-api-access-qsw9c") pod "ec8e92f8-bb79-4bba-8450-3dfedc04304e" (UID: "ec8e92f8-bb79-4bba-8450-3dfedc04304e"). InnerVolumeSpecName "kube-api-access-qsw9c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:49:08.516961 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.516923 2546 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-runtime-f3b83-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ec8e92f8-bb79-4bba-8450-3dfedc04304e-raw-sklearn-runtime-f3b83-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:49:08.516961 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.516955 2546 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec8e92f8-bb79-4bba-8450-3dfedc04304e-kserve-provision-location\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:49:08.516961 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.516966 2546 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec8e92f8-bb79-4bba-8450-3dfedc04304e-proxy-tls\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:49:08.517188 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.516977 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qsw9c\" (UniqueName: \"kubernetes.io/projected/ec8e92f8-bb79-4bba-8450-3dfedc04304e-kube-api-access-qsw9c\") on node \"ip-10-0-137-46.ec2.internal\" DevicePath \"\"" Apr 17 17:49:08.900901 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.900863 2546 generic.go:358] "Generic (PLEG): container finished" podID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerID="3a13f17e3da88faa17607a13621d33a57bb28a21dc31fa33a7b4ba28f2d71b9c" exitCode=0 Apr 17 17:49:08.901061 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.900921 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" 
event={"ID":"ec8e92f8-bb79-4bba-8450-3dfedc04304e","Type":"ContainerDied","Data":"3a13f17e3da88faa17607a13621d33a57bb28a21dc31fa33a7b4ba28f2d71b9c"} Apr 17 17:49:08.901061 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.900955 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" event={"ID":"ec8e92f8-bb79-4bba-8450-3dfedc04304e","Type":"ContainerDied","Data":"b54cf03669dec34fbbe3047eca455614ae61412615fdbd84c79dc53603240db1"} Apr 17 17:49:08.901061 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.900955 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck" Apr 17 17:49:08.901061 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.900969 2546 scope.go:117] "RemoveContainer" containerID="49280ff6d5d7475e3586da3be08c64597dc214de782aabda6cb06d65b6998cc1" Apr 17 17:49:08.909734 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.909713 2546 scope.go:117] "RemoveContainer" containerID="3a13f17e3da88faa17607a13621d33a57bb28a21dc31fa33a7b4ba28f2d71b9c" Apr 17 17:49:08.916563 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.916546 2546 scope.go:117] "RemoveContainer" containerID="a377e1a9ba0a90d75f04a83273a97aa4ac621ab83ec82291cdba5f616bd5e2df" Apr 17 17:49:08.923985 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.923960 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"] Apr 17 17:49:08.924800 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.924777 2546 scope.go:117] "RemoveContainer" containerID="49280ff6d5d7475e3586da3be08c64597dc214de782aabda6cb06d65b6998cc1" Apr 17 17:49:08.925061 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:49:08.925040 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"49280ff6d5d7475e3586da3be08c64597dc214de782aabda6cb06d65b6998cc1\": container with ID starting with 49280ff6d5d7475e3586da3be08c64597dc214de782aabda6cb06d65b6998cc1 not found: ID does not exist" containerID="49280ff6d5d7475e3586da3be08c64597dc214de782aabda6cb06d65b6998cc1" Apr 17 17:49:08.925125 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.925072 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49280ff6d5d7475e3586da3be08c64597dc214de782aabda6cb06d65b6998cc1"} err="failed to get container status \"49280ff6d5d7475e3586da3be08c64597dc214de782aabda6cb06d65b6998cc1\": rpc error: code = NotFound desc = could not find container \"49280ff6d5d7475e3586da3be08c64597dc214de782aabda6cb06d65b6998cc1\": container with ID starting with 49280ff6d5d7475e3586da3be08c64597dc214de782aabda6cb06d65b6998cc1 not found: ID does not exist" Apr 17 17:49:08.925125 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.925094 2546 scope.go:117] "RemoveContainer" containerID="3a13f17e3da88faa17607a13621d33a57bb28a21dc31fa33a7b4ba28f2d71b9c" Apr 17 17:49:08.925324 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:49:08.925307 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a13f17e3da88faa17607a13621d33a57bb28a21dc31fa33a7b4ba28f2d71b9c\": container with ID starting with 3a13f17e3da88faa17607a13621d33a57bb28a21dc31fa33a7b4ba28f2d71b9c not found: ID does not exist" containerID="3a13f17e3da88faa17607a13621d33a57bb28a21dc31fa33a7b4ba28f2d71b9c" Apr 17 17:49:08.925374 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.925341 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a13f17e3da88faa17607a13621d33a57bb28a21dc31fa33a7b4ba28f2d71b9c"} err="failed to get container status \"3a13f17e3da88faa17607a13621d33a57bb28a21dc31fa33a7b4ba28f2d71b9c\": rpc error: code = NotFound desc = could not find container 
\"3a13f17e3da88faa17607a13621d33a57bb28a21dc31fa33a7b4ba28f2d71b9c\": container with ID starting with 3a13f17e3da88faa17607a13621d33a57bb28a21dc31fa33a7b4ba28f2d71b9c not found: ID does not exist" Apr 17 17:49:08.925374 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.925357 2546 scope.go:117] "RemoveContainer" containerID="a377e1a9ba0a90d75f04a83273a97aa4ac621ab83ec82291cdba5f616bd5e2df" Apr 17 17:49:08.925580 ip-10-0-137-46 kubenswrapper[2546]: E0417 17:49:08.925563 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a377e1a9ba0a90d75f04a83273a97aa4ac621ab83ec82291cdba5f616bd5e2df\": container with ID starting with a377e1a9ba0a90d75f04a83273a97aa4ac621ab83ec82291cdba5f616bd5e2df not found: ID does not exist" containerID="a377e1a9ba0a90d75f04a83273a97aa4ac621ab83ec82291cdba5f616bd5e2df" Apr 17 17:49:08.925634 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.925583 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a377e1a9ba0a90d75f04a83273a97aa4ac621ab83ec82291cdba5f616bd5e2df"} err="failed to get container status \"a377e1a9ba0a90d75f04a83273a97aa4ac621ab83ec82291cdba5f616bd5e2df\": rpc error: code = NotFound desc = could not find container \"a377e1a9ba0a90d75f04a83273a97aa4ac621ab83ec82291cdba5f616bd5e2df\": container with ID starting with a377e1a9ba0a90d75f04a83273a97aa4ac621ab83ec82291cdba5f616bd5e2df not found: ID does not exist" Apr 17 17:49:08.926065 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:08.926048 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-f3b83-predictor-797897c69c-7v4ck"] Apr 17 17:49:09.953317 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:09.953281 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" path="/var/lib/kubelet/pods/ec8e92f8-bb79-4bba-8450-3dfedc04304e/volumes" Apr 17 17:49:32.395946 
ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:32.395917 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rrr95_b7bb99d1-76a8-40ac-9a30-1ebde78e79f3/global-pull-secret-syncer/0.log" Apr 17 17:49:32.486476 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:32.486448 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-sf4kc_e9676b31-a86d-4645-96b9-1bfa16b53a94/konnectivity-agent/0.log" Apr 17 17:49:32.596522 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:32.596485 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-46.ec2.internal_bec88f0e450f3cfd4f0ac310ffbc3b96/haproxy/0.log" Apr 17 17:49:35.601483 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:35.601449 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_289f3232-3555-4e14-a1f1-ef291fa65ef9/alertmanager/0.log" Apr 17 17:49:35.667021 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:35.666991 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_289f3232-3555-4e14-a1f1-ef291fa65ef9/config-reloader/0.log" Apr 17 17:49:35.714056 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:35.714022 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_289f3232-3555-4e14-a1f1-ef291fa65ef9/kube-rbac-proxy-web/0.log" Apr 17 17:49:35.739257 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:35.739232 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_289f3232-3555-4e14-a1f1-ef291fa65ef9/kube-rbac-proxy/0.log" Apr 17 17:49:35.764537 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:35.764509 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_289f3232-3555-4e14-a1f1-ef291fa65ef9/kube-rbac-proxy-metric/0.log" Apr 17 17:49:35.787227 
ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:35.787202 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_289f3232-3555-4e14-a1f1-ef291fa65ef9/prom-label-proxy/0.log" Apr 17 17:49:35.808474 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:35.808448 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_289f3232-3555-4e14-a1f1-ef291fa65ef9/init-config-reloader/0.log" Apr 17 17:49:35.855074 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:35.854991 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-8kt47_61d0dab8-9f71-4bee-b48b-178b647667dd/cluster-monitoring-operator/0.log" Apr 17 17:49:35.884855 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:35.884831 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7qddr_f62beaf3-38e8-43b7-8bda-61534c3eb9a3/kube-state-metrics/0.log" Apr 17 17:49:35.906744 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:35.906715 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7qddr_f62beaf3-38e8-43b7-8bda-61534c3eb9a3/kube-rbac-proxy-main/0.log" Apr 17 17:49:35.926467 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:35.926439 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7qddr_f62beaf3-38e8-43b7-8bda-61534c3eb9a3/kube-rbac-proxy-self/0.log" Apr 17 17:49:35.965694 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:35.965661 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-567bdb595f-b8hpp_aff002da-4836-41df-8ef7-6b2c94ee451b/metrics-server/0.log" Apr 17 17:49:35.995700 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:35.995650 2546 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-xgv4w_0a6d5ca8-e84c-43e3-b204-3b6fd91ac786/monitoring-plugin/0.log"
Apr 17 17:49:36.094864 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:36.094823 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bcdtv_2e480281-dc92-4d6c-99c6-1d7dbd41136d/node-exporter/0.log"
Apr 17 17:49:36.127250 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:36.127174 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bcdtv_2e480281-dc92-4d6c-99c6-1d7dbd41136d/kube-rbac-proxy/0.log"
Apr 17 17:49:36.178405 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:36.178380 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bcdtv_2e480281-dc92-4d6c-99c6-1d7dbd41136d/init-textfile/0.log"
Apr 17 17:49:36.312374 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:36.312337 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-687v8_9fa27954-0b1c-4018-86c7-30ed361a6229/kube-rbac-proxy-main/0.log"
Apr 17 17:49:36.333587 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:36.333559 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-687v8_9fa27954-0b1c-4018-86c7-30ed361a6229/kube-rbac-proxy-self/0.log"
Apr 17 17:49:36.354137 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:36.354112 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-687v8_9fa27954-0b1c-4018-86c7-30ed361a6229/openshift-state-metrics/0.log"
Apr 17 17:49:36.633154 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:36.633124 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8486f94db4-5fm4k_d4df2b4e-6557-426d-b06d-bda244334381/telemeter-client/0.log"
Apr 17 17:49:36.653910 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:36.653887 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8486f94db4-5fm4k_d4df2b4e-6557-426d-b06d-bda244334381/reload/0.log"
Apr 17 17:49:36.673490 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:36.673459 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8486f94db4-5fm4k_d4df2b4e-6557-426d-b06d-bda244334381/kube-rbac-proxy/0.log"
Apr 17 17:49:38.778504 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:38.778472 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78565777f8-64tbj_b27b925a-4241-4612-8244-b96ad45d3c7b/console/0.log"
Apr 17 17:49:39.199481 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.199451 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"]
Apr 17 17:49:39.199887 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.199871 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kserve-container"
Apr 17 17:49:39.199995 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.199889 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kserve-container"
Apr 17 17:49:39.199995 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.199908 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="storage-initializer"
Apr 17 17:49:39.199995 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.199916 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="storage-initializer"
Apr 17 17:49:39.199995 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.199930 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kube-rbac-proxy"
Apr 17 17:49:39.199995 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.199938 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kube-rbac-proxy"
Apr 17 17:49:39.199995 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.199955 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="storage-initializer"
Apr 17 17:49:39.199995 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.199965 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="storage-initializer"
Apr 17 17:49:39.199995 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.199983 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kserve-container"
Apr 17 17:49:39.199995 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.199991 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kserve-container"
Apr 17 17:49:39.200424 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.200003 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kube-rbac-proxy"
Apr 17 17:49:39.200424 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.200011 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kube-rbac-proxy"
Apr 17 17:49:39.200424 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.200096 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kube-rbac-proxy"
Apr 17 17:49:39.200424 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.200111 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="495db916-37c3-4257-ac7e-7d86b6638fcc" containerName="kserve-container"
Apr 17 17:49:39.200424 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.200121 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kserve-container"
Apr 17 17:49:39.200424 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.200130 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec8e92f8-bb79-4bba-8450-3dfedc04304e" containerName="kube-rbac-proxy"
Apr 17 17:49:39.203272 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.203251 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.205472 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.205446 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7ttz7\"/\"default-dockercfg-ml4hx\""
Apr 17 17:49:39.205787 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.205771 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7ttz7\"/\"openshift-service-ca.crt\""
Apr 17 17:49:39.206300 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.206285 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7ttz7\"/\"kube-root-ca.crt\""
Apr 17 17:49:39.217885 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.217863 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"]
Apr 17 17:49:39.281343 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.281309 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/714a1345-8266-46b9-9a57-df7a11a661a2-podres\") pod \"perf-node-gather-daemonset-qrh54\" (UID: \"714a1345-8266-46b9-9a57-df7a11a661a2\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.281550 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.281369 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/714a1345-8266-46b9-9a57-df7a11a661a2-proc\") pod \"perf-node-gather-daemonset-qrh54\" (UID: \"714a1345-8266-46b9-9a57-df7a11a661a2\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.281550 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.281413 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/714a1345-8266-46b9-9a57-df7a11a661a2-lib-modules\") pod \"perf-node-gather-daemonset-qrh54\" (UID: \"714a1345-8266-46b9-9a57-df7a11a661a2\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.281550 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.281447 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwhvk\" (UniqueName: \"kubernetes.io/projected/714a1345-8266-46b9-9a57-df7a11a661a2-kube-api-access-lwhvk\") pod \"perf-node-gather-daemonset-qrh54\" (UID: \"714a1345-8266-46b9-9a57-df7a11a661a2\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.281550 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.281516 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/714a1345-8266-46b9-9a57-df7a11a661a2-sys\") pod \"perf-node-gather-daemonset-qrh54\" (UID: \"714a1345-8266-46b9-9a57-df7a11a661a2\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.382556 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.382515 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/714a1345-8266-46b9-9a57-df7a11a661a2-podres\") pod \"perf-node-gather-daemonset-qrh54\" (UID: \"714a1345-8266-46b9-9a57-df7a11a661a2\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.382556 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.382564 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/714a1345-8266-46b9-9a57-df7a11a661a2-proc\") pod \"perf-node-gather-daemonset-qrh54\" (UID: \"714a1345-8266-46b9-9a57-df7a11a661a2\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.382835 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.382582 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/714a1345-8266-46b9-9a57-df7a11a661a2-lib-modules\") pod \"perf-node-gather-daemonset-qrh54\" (UID: \"714a1345-8266-46b9-9a57-df7a11a661a2\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.382835 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.382605 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwhvk\" (UniqueName: \"kubernetes.io/projected/714a1345-8266-46b9-9a57-df7a11a661a2-kube-api-access-lwhvk\") pod \"perf-node-gather-daemonset-qrh54\" (UID: \"714a1345-8266-46b9-9a57-df7a11a661a2\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.382835 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.382639 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/714a1345-8266-46b9-9a57-df7a11a661a2-sys\") pod \"perf-node-gather-daemonset-qrh54\" (UID: \"714a1345-8266-46b9-9a57-df7a11a661a2\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.382835 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.382663 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/714a1345-8266-46b9-9a57-df7a11a661a2-proc\") pod \"perf-node-gather-daemonset-qrh54\" (UID: \"714a1345-8266-46b9-9a57-df7a11a661a2\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.382835 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.382701 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/714a1345-8266-46b9-9a57-df7a11a661a2-podres\") pod \"perf-node-gather-daemonset-qrh54\" (UID: \"714a1345-8266-46b9-9a57-df7a11a661a2\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.382835 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.382741 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/714a1345-8266-46b9-9a57-df7a11a661a2-lib-modules\") pod \"perf-node-gather-daemonset-qrh54\" (UID: \"714a1345-8266-46b9-9a57-df7a11a661a2\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.382835 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.382780 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/714a1345-8266-46b9-9a57-df7a11a661a2-sys\") pod \"perf-node-gather-daemonset-qrh54\" (UID: \"714a1345-8266-46b9-9a57-df7a11a661a2\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.390702 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.390666 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwhvk\" (UniqueName: \"kubernetes.io/projected/714a1345-8266-46b9-9a57-df7a11a661a2-kube-api-access-lwhvk\") pod \"perf-node-gather-daemonset-qrh54\" (UID: \"714a1345-8266-46b9-9a57-df7a11a661a2\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.513209 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.513122 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:39.633779 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.633721 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"]
Apr 17 17:49:39.636246 ip-10-0-137-46 kubenswrapper[2546]: W0417 17:49:39.636209 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod714a1345_8266_46b9_9a57_df7a11a661a2.slice/crio-61b9ab797a27d2ccb49eb657625a74f79c0f10481ef12fa2e0b24d9e05adaa62 WatchSource:0}: Error finding container 61b9ab797a27d2ccb49eb657625a74f79c0f10481ef12fa2e0b24d9e05adaa62: Status 404 returned error can't find the container with id 61b9ab797a27d2ccb49eb657625a74f79c0f10481ef12fa2e0b24d9e05adaa62
Apr 17 17:49:39.934326 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.934300 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sm5m5_d82d71b1-2458-4671-b28c-5e3870cd761a/dns/0.log"
Apr 17 17:49:39.954816 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:39.954795 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sm5m5_d82d71b1-2458-4671-b28c-5e3870cd761a/kube-rbac-proxy/0.log"
Apr 17 17:49:40.005960 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:40.005931 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54" event={"ID":"714a1345-8266-46b9-9a57-df7a11a661a2","Type":"ContainerStarted","Data":"d57b37fd5a452b5f3c555eb288f25052a65acecb8eff6d4a497ba5dca81b6f1d"}
Apr 17 17:49:40.005960 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:40.005964 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54" event={"ID":"714a1345-8266-46b9-9a57-df7a11a661a2","Type":"ContainerStarted","Data":"61b9ab797a27d2ccb49eb657625a74f79c0f10481ef12fa2e0b24d9e05adaa62"}
Apr 17 17:49:40.006184 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:40.006086 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:40.018447 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:40.018424 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-r6rvm_a2e022ed-9ba3-454c-9f27-4b300f2393d6/dns-node-resolver/0.log"
Apr 17 17:49:40.022840 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:40.022807 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54" podStartSLOduration=1.022794582 podStartE2EDuration="1.022794582s" podCreationTimestamp="2026-04-17 17:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:49:40.021340283 +0000 UTC m=+1472.643620950" watchObservedRunningTime="2026-04-17 17:49:40.022794582 +0000 UTC m=+1472.645075257"
Apr 17 17:49:40.404452 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:40.404421 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7cd7cd7b76-mbkvf_bf3f4e12-9e99-452e-8ff8-dd441d2d2b39/registry/0.log"
Apr 17 17:49:40.424435 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:40.424404 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2phxj_8b1e9f99-746c-4f28-80b9-ea9eb814cd98/node-ca/0.log"
Apr 17 17:49:41.158164 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:41.158128 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-cc85759b6-w8jxm_e0ddb199-4f09-4f38-9d09-304ed7807840/router/0.log"
Apr 17 17:49:41.498791 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:41.498700 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kgnl9_e4397ebe-1923-4566-89e4-f777e71713b1/serve-healthcheck-canary/0.log"
Apr 17 17:49:41.872850 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:41.872818 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-99kn4_e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc/insights-operator/0.log"
Apr 17 17:49:41.873179 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:41.873163 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-99kn4_e8e28f97-fdcd-4cdd-bc14-75b5ce293dcc/insights-operator/1.log"
Apr 17 17:49:42.047844 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:42.047817 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m2mcc_b7243405-fea2-48ef-80da-809f729864d2/kube-rbac-proxy/0.log"
Apr 17 17:49:42.068345 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:42.068314 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m2mcc_b7243405-fea2-48ef-80da-809f729864d2/exporter/0.log"
Apr 17 17:49:42.089556 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:42.089529 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m2mcc_b7243405-fea2-48ef-80da-809f729864d2/extractor/0.log"
Apr 17 17:49:43.990734 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:43.990697 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-85dd7cfb4d-bbwsx_0dcae6a1-3266-4e8a-bc79-fcd36bffaa2c/manager/0.log"
Apr 17 17:49:44.010390 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:44.010362 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-s5cxq_0d8ad29e-c7bc-476f-a723-5c96ac217a68/manager/0.log"
Apr 17 17:49:46.019917 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:46.019891 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-qrh54"
Apr 17 17:49:48.180992 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:48.180895 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-hpbfh_95a78b42-cd3a-409d-8ce7-11b8805103c6/kube-storage-version-migrator-operator/1.log"
Apr 17 17:49:48.182287 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:48.182260 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-hpbfh_95a78b42-cd3a-409d-8ce7-11b8805103c6/kube-storage-version-migrator-operator/0.log"
Apr 17 17:49:49.479037 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:49.479000 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q92kj_17cc3e64-292f-4b71-9d6f-6deb75cffce6/kube-multus-additional-cni-plugins/0.log"
Apr 17 17:49:49.500982 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:49.500952 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q92kj_17cc3e64-292f-4b71-9d6f-6deb75cffce6/egress-router-binary-copy/0.log"
Apr 17 17:49:49.523659 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:49.523631 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q92kj_17cc3e64-292f-4b71-9d6f-6deb75cffce6/cni-plugins/0.log"
Apr 17 17:49:49.548426 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:49.548396 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q92kj_17cc3e64-292f-4b71-9d6f-6deb75cffce6/bond-cni-plugin/0.log"
Apr 17 17:49:49.573137 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:49.573109 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q92kj_17cc3e64-292f-4b71-9d6f-6deb75cffce6/routeoverride-cni/0.log"
Apr 17 17:49:49.596500 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:49.596469 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q92kj_17cc3e64-292f-4b71-9d6f-6deb75cffce6/whereabouts-cni-bincopy/0.log"
Apr 17 17:49:49.618960 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:49.618933 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q92kj_17cc3e64-292f-4b71-9d6f-6deb75cffce6/whereabouts-cni/0.log"
Apr 17 17:49:49.690924 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:49.690897 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qz758_7eb9cbde-0842-4e38-ab1b-0c93d220e92a/kube-multus/0.log"
Apr 17 17:49:49.766638 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:49.766556 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fbmql_173598bb-6dcc-46e9-a78f-f3d5c1fd4297/network-metrics-daemon/0.log"
Apr 17 17:49:49.787424 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:49.787389 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fbmql_173598bb-6dcc-46e9-a78f-f3d5c1fd4297/kube-rbac-proxy/0.log"
Apr 17 17:49:50.597897 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:50.597863 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/ovn-controller/0.log"
Apr 17 17:49:50.614733 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:50.614709 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/ovn-acl-logging/0.log"
Apr 17 17:49:50.628639 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:50.628612 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/ovn-acl-logging/1.log"
Apr 17 17:49:50.650636 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:50.650609 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/kube-rbac-proxy-node/0.log"
Apr 17 17:49:50.672440 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:50.672420 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 17:49:50.689162 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:50.689145 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/northd/0.log"
Apr 17 17:49:50.708274 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:50.708253 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/nbdb/0.log"
Apr 17 17:49:50.731124 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:50.731096 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/sbdb/0.log"
Apr 17 17:49:50.922563 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:50.922482 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ft44m_e869ba13-1af3-46e4-bbaa-eef8b748f612/ovnkube-controller/0.log"
Apr 17 17:49:52.416174 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:52.416147 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zgwbf_fa8cdfa0-8080-411d-bd6e-51b977229392/network-check-target-container/0.log"
Apr 17 17:49:53.320362 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:53.320328 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-b4h7q_19f42a69-1efb-4887-94b9-58c87acfe319/iptables-alerter/0.log"
Apr 17 17:49:53.960785 ip-10-0-137-46 kubenswrapper[2546]: I0417 17:49:53.960748 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-wpsq6_53031ab0-20d5-45d9-8cf7-e2d6fc9ec15f/tuned/0.log"