canceling a stream many times on the same client in quick succession triggers a Received RST_STREAM with code 2 error #2872
Comments
I made a repository with the reproduction here: https://github.com/gunters63/reply-from-bug

I added a variation of the reproduction using server streaming calls; this time it always fails after about 1450-1500 calls. Maybe this is a bug in Node or nghttp2 (the HTTP/2 C library Node uses)?
I also reproduced the problem on Ubuntu 24.04 with Node v20.16.
I converted the test to a pure HTTP/2 equivalent (see pure-http2.js in the repo) and I still get the same error, this time after about 1200 calls. So this may be an upstream problem in Node.js.
I think the behavior you are seeing is not a bug at all, but rather Node's mitigation for the HTTP/2 rapid reset vulnerability. You are likely seeing the GOAWAY after a different number of requests in each test because the limit depends on the exact rate of RST_STREAM frames, as explained in a comment on the change. The best way to avoid this problem is for your client not to cancel requests that it is handling normally. The intended use of cancellation is to handle unexpected situations on the client, when it decides after starting a call that it no longer needs the response. If the client knows in advance how many messages it wants from the server, it should communicate that information to the server in advance. Alternatively, it could half close on the client (…)
Thank you for that valuable information. I am not sure how I can handle this in my application: I have a UI running in a browser with dozens of React components on each screen. Each component wants to display a handful of sensor values in real time. These values are changing all the time and generate an endless stream of value-change events.

Because the application runs in a web browser it has to use grpc-web, which limits streaming to server-side only. Each React component separately subscribes to a handful of values in the gRPC server backend through a server-side streaming call. The subscription pushes the endless stream of value changes back to the browser client. There can be dozens of streaming calls running concurrently. This works well because the browser and the grpc-web library automatically multiplex the calls through a single HTTP/2 session.

When the user interacts with the UI and changes screens, panels, etc., those React components unmount and the subscriptions should be terminated from the client side. In fact this happens in a useEffect in each React component. I haven't seen any way other than to cancel the streaming calls using an AbortController. Is there a better way to do this that prevents those mitigations from kicking in? I cannot tell the server to terminate the streams from its side because they are not bidi streams.

Until recently the browser application was talking directly to a gRPC on .NET backend (.NET 8.0). The .NET backend never had a problem with this; everything was stable. But we wanted to route the gRPC communication through a small HTTP forwarding proxy written in Node (using fastify and some plugins), and that's when these problems showed up.
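The unmount-driven cancellation described above can be sketched without React; `client.subscribeValues` is a hypothetical grpc-web transport call, and only the AbortController wiring matters:

```javascript
// Minimal sketch: each component starts a server-streaming subscription on
// mount and cancels it on unmount. The actual transport call is hypothetical.
function startSubscription(/* client, request */) {
  const ac = new AbortController();
  // client.subscribeValues(request, { abort: ac.signal })
  //   .responses.onMessage((msg) => { /* update component state */ });
  return {
    signal: ac.signal,
    cancel: () => ac.abort(), // called from the useEffect cleanup on unmount
  };
}
```

In React this would run inside a useEffect, with `cancel` returned as the cleanup function so the stream is aborted exactly when the component unmounts.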
I tried my reproduction program now with a Node version prior to the RST mitigation patch (my current version is Node.js v23.4.0), and indeed, it runs without a problem:
❯ ~/Downloads/node-v20.6.1-linux-x64/bin/node -v
v20.6.1
❯ ~/Downloads/node-v20.6.1-linux-x64/bin/node pure-http2.js
HTTP/2 server is running on http://localhost:3000
10000
Session closed
And yes, you can actually configure the rate limiting in the options object of Node's createServer call:
And if I change my createServer line to this: const server = http2.createServer({ streamResetBurst: Number.MAX_SAFE_INTEGER, streamResetRate: Number.MAX_SAFE_INTEGER }); the test runs to completion.
This does change things. Your example was of a Node client making a bidirectional streaming request, so that is what I addressed in my response. You are correct that there is no way other than cancellation to stop an endless server-streaming call whenever the client wants to stop it. It would probably be possible to expose configuration options for those http2 server options in the gRPC Server constructor, but that won't happen until at least after the new year. Alternatively, you may just be able to have your client retry requests that fail this way. They could only be sent out on a new session, so they should succeed.
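The retry suggestion can be sketched as a small wrapper; the error-matching predicate here is hypothetical (in practice you would inspect the gRPC status or error code for GOAWAY/RST_STREAM failures):

```javascript
// Retry a call that failed because the server tore the session down.
// A retried attempt goes out on a fresh HTTP/2 session, so it should succeed.
async function callWithRetry(startCall, { retries = 1 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await startCall();
    } catch (err) {
      // Hypothetical predicate: match GOAWAY / "Received RST_STREAM" errors.
      const retriable = /RST_STREAM|GOAWAY/i.test(String(err && err.message));
      if (!retriable || attempt >= retries) throw err;
    }
  }
}
```

Usage: wrap the function that starts the streaming call, e.g. `callWithRetry(() => makeStreamingCall(request))`.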
My gRPC backend is not running under Node with grpc-js, so that wouldn't help with my problem, but maybe someone else has the same problem, so making this configurable surely won't hurt. My problem is that I have a Node program proxy-forwarding gRPC HTTP/2 requests from the browser to a gRPC .NET backend, and the forwarding fails after a while because the server tells the client to go away. Because I have to use HTTP/2 with TLS (browsers don't support plain HTTP/2) I have to use createSecureServer in the proxy:
but Node's typings suddenly don't know about those properties (they work for plain createServer). I hope this is only a bug in the typings :)
Are you using grpc-js at all in your production setup?
I use grpc-js through the excellent protobuf-ts package (https://github.com/timostamm/protobuf-ts). Hmm, now that you mention it: while protobuf-ts has dependencies on grpc-js (I didn't see grpc-web), it maybe only uses it to import type definitions and enums when used in the browser through @protobuf-ts/grpcweb-transport. When implementing a Node gRPC server with protobuf-ts you definitely need grpc-js at runtime. I implement a mock gRPC server running on Node to test the UI; this uses grpc-js. The production gRPC server, though, is implemented in gRPC on .NET. So, to answer your question: no, not at runtime.
I tested the rate-limiting parameters now with http2's createSecureServer, and they also work there: https://github.com/gunters63/reply-from-bug/blob/main/pure-https.js
They are not documented in the Node.js documentation (only for http2 createServer), and they are also missing from the TypeScript definitions. I should report this as a bug in the Node repo, I think.
Problem description
I am currently debugging an issue I have with a HTTP proxy failing after a certain number of canceled grpc streaming calls.
So I tried to write a small reproduction, but I have problems even when no proxy is involved.
If I make a streaming call which I cancel, many times in quick succession, the call fails with a
Received RST_STREAM with code 2 error
after roughly 1000 calls. If I insert a small delay between the calls, it works.
Reproduction steps
Comment out the line
await new Promise((resolve) => setTimeout(resolve, 5));
and run node test.mjs:
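The failing loop looks roughly like this (hypothetical `client.serverStream` call; only the cancel-immediately pattern matters):

```javascript
// Sketch of the reproduction: start a streaming call and cancel it
// immediately, thousands of times, with no delay between iterations.
async function hammer(iterations = 10000) {
  let aborted = 0;
  for (let i = 0; i < iterations; i++) {
    const ac = new AbortController();
    // const stream = client.serverStream(request, { signal: ac.signal });
    ac.abort(); // each abort would send an RST_STREAM frame to the server
    if (ac.signal.aborted) aborted++;
    // Uncommenting the pacing delay avoids tripping the mitigation:
    // await new Promise((resolve) => setTimeout(resolve, 5));
  }
  return aborted;
}
```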
Environment