[Bug] UI operations are extremely laggy while GPT is streaming a response #2823
Comments
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
@Sun-drenched Could you help confirm whether other action buttons lag too? For example, the import-configuration, share, or agent-settings edit buttons.
The three you mentioned don't lag.
@Sun-drenched Then does the session settings panel in the top-right corner lag?
Yes. Subjectively, the lag is about the same as in the application settings.
Then it looks like an intercepting-route problem; we need to look into whether there's a fix. cc @canisminor1990
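For context on the suspected culprit: Next.js defines intercepting routes by directory convention. A folder prefixed with `(.)`, `(..)`, or `(...)` intercepts client-side navigation to the matched route and renders it inside the current layout (typically as a modal), while a hard navigation still renders the full page. A hypothetical layout illustrating the pattern (not LobeChat's actual tree):

```
app/
├── chat/
│   ├── page.tsx           # chat view
│   └── (..)settings/
│       └── page.tsx       # soft navigation from /chat: /settings rendered as a modal
└── settings/
    └── page.tsx           # direct load or reload of /settings: full page
```

The relevant property here is that the intercepted route is still a server-rendered route: opening the modal triggers a navigation and a server round trip, unlike a purely client-side modal.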
The main issue seems to be that the project currently uses RSC heavily, so many UI operations have to wait on an RSC response. You don't notice it when developing locally or running on the same machine, but behind a Cloudflare tunnel, or on a server with higher latency, UI operations wait a long time. In theory, most UI operations shouldn't need to wait for a server response at all, right?
@hjkcai That's only partially correct. Many of LobeChat's interfaces can be controlled through server-side configuration: we have a lot of Feature Flags that toggle whether certain elements are shown (WebRTC sync, model settings, and so on). In a traditional SSR'd SPA it's hard to implement such front-end switches cleanly, and RSC is clearly a good technical fit for that requirement. My rough guess is that the root cause of this issue has nothing to do with RSC and is more likely related to Next.js intercepting routes. (The reason it's not RSC: clicking before and after the output isn't laggy; it only lags while the response is streaming.)
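The server-controlled feature flags mentioned above boil down to parsing an environment string into a flag object that server components can read. A minimal sketch of that idea, with illustrative flag names and a `+flag,-flag` syntax that are assumptions, not LobeChat's actual implementation:

```typescript
// Hypothetical parser for a FEATURE_FLAGS-style env string such as
// "-webrtc_sync,+model_settings". Flag names and syntax are illustrative.
interface FeatureFlags {
  webrtcSync: boolean;
  modelSettings: boolean;
}

const DEFAULT_FLAGS: FeatureFlags = {
  webrtcSync: false,
  modelSettings: true,
};

function parseFeatureFlags(raw: string): FeatureFlags {
  const flags: FeatureFlags = { ...DEFAULT_FLAGS };
  const tokens = raw.split(',').map((t) => t.trim()).filter(Boolean);
  for (const token of tokens) {
    const enabled = !token.startsWith('-');       // "-name" disables, "+name" enables
    const name = token.replace(/^[+-]/, '');      // strip the prefix to get the flag name
    if (name === 'webrtc_sync') flags.webrtcSync = enabled;
    if (name === 'model_settings') flags.modelSettings = enabled;
  }
  return flags;
}
```

Because the flags live in server-side env vars, an RSC can evaluate them and simply omit the corresponding UI, which is the "perfect front-end switch" that is hard to do in a classic SSR'd SPA.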
@arvinxx Maybe the configuration could be pulled once via RSC at startup? Right now every UI interaction fetches the configuration independently, adding latency each time. The interaction-latency problem probably isn't directly related to this issue, but it does hurt the user experience.
Right, that's the optimization. To fix this issue, we need to replace the current RSC logic that fetches the model configuration with the one-time fetch you describe, cached in memory, and then convert the current intercepting routes into ordinary modals. We'll do this optimization after the 1.0 release.
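The "fetch once, cache in memory" half of the plan can be sketched as a module-level memoized promise: the first caller triggers the request, and every later UI interaction reuses the same in-flight or resolved result. `ServerConfig`, its fields, and `fetchServerConfig` are hypothetical stand-ins for whatever RSC/API call supplies the config today; the caching shape is the point, not the names.

```typescript
// Hypothetical server-config shape; the real config is richer.
interface ServerConfig {
  defaultModel: string;
  enabledProviders: string[];
}

// Module-level cache: shared by every caller in the same process/tab.
let cachedConfig: Promise<ServerConfig> | null = null;

function getServerConfig(
  fetchServerConfig: () => Promise<ServerConfig>,
): Promise<ServerConfig> {
  // First caller kicks off the fetch; concurrent and later callers reuse the
  // same promise, so the server is hit exactly once instead of per interaction.
  cachedConfig ??= fetchServerConfig();
  return cachedConfig;
}
```

Caching the promise (rather than the resolved value) also deduplicates concurrent callers during the initial load, so opening two panels at startup still costs one round trip.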
Could you open a separate issue for that and list the places where you see the interaction latency you mentioned? We've already fixed a lot of them, and I'm not sure where else the problem remains.
I have the exact same delay on mobile (Android) too, but I also see extreme lag in response streaming. The text appears very slowly and takes far too long to finish, more than several minutes even on the fastest models. (Meaning the streaming has ended, but for some reason the response hasn't finished displaying.) It's a bit better in smaller chats with shorter history.
I found a workaround: just open the settings and wait about 20-30 seconds for it to finish responding. Yes, that's a weird workaround, but it saves me 5-10 minutes per answer (I'm serious, that's how long it usually takes).
This issue is closed. If you have any questions, you can comment and reply.
🎉 This issue has been resolved in version 1.15.19 🎉 The release is available on: Your semantic-release bot 📦🚀 |
📦 Environment
📌 Version
0.162.20
💻 Operating System
🌐 Browser
🐛 Bug Description
With streaming responses enabled, if GPT's answer is fairly long (taking perhaps 15-20 s to finish rendering), then while the answer is being output, opening the settings and interacting inside them is extremely laggy or even unresponsive. Once the answer finishes, everything returns to normal.
The problem also occurs with client-side request mode disabled. In the video I reproduce it: before GPT starts answering, opening the settings is smooth (this LobeChat instance is deployed on a server in China, so there's no network issue); once GPT starts streaming the answer, UI operations such as opening the settings become extremely laggy. You can see me clicking Settings repeatedly, and the settings panel only appears after the answer finishes.
2024-06-09.163809.mp4
📷 Recurrence Steps
No response
🚦 Expected Behavior
No response
📝 Additional Information
No response