
Too many subscriptions to Server.close event cause an OOM? #954

Open

LRagji opened this issue Nov 27, 2023 · 11 comments

Comments

@LRagji

LRagji commented Nov 27, 2023


Describe the bug (be clear and concise)

This is an observation pointing to a potential OOM. In our application we are using this package as shown in your first example:

```typescript
import * as express from 'express';
import { createProxyMiddleware, Filter, Options, RequestHandler } from 'http-proxy-middleware';

const app = express();

app.use('/api', createProxyMiddleware({ target: 'http://www.example.org', changeOrigin: true }));
app.listen(3000);
```

We are stress testing this API with multiple requests, and we have observed that the library code attaches multiple event handlers to server.close, which prompts Node to suspect a memory leak: MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 connected listeners added to [NativeConnection].

The obvious way to fix this is to create the middleware once, capture it in a variable, and use that, as shown in your third example on the README page, instead of recreating the middleware every time as in the first example.
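For illustration, here is a minimal sketch of that create-once pattern, based on the README example (the variable name apiProxy is ours):

```typescript
import * as express from 'express';
import { createProxyMiddleware } from 'http-proxy-middleware';

const app = express();

// Create the proxy middleware a single time at startup...
const apiProxy = createProxyMiddleware({
  target: 'http://www.example.org',
  changeOrigin: true,
});

// ...and reuse the same instance for every request.
app.use('/api', apiProxy);

app.listen(3000);
```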

If our conclusion is right, then I think you should remove inline creation of the middleware from your examples so that others don't fall for the same issue.

Step-by-step reproduction instructions

1. Create the first example as shown on the README page (Node v18, latest version of this package installed).
2. Make sure the upstream server takes 1 second to complete the request (a sketch of such a slow upstream follows these steps).
3. Nuke the API with any performance or stress-testing tool.
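
For step 2, a minimal sketch of a slow upstream (our own test helper, not part of the library; it assumes the proxy target is pointed at http://localhost:3001 instead of example.org):

```typescript
import * as http from 'http';

// Hypothetical slow upstream: delay every response by one second so that
// many proxied requests stay in flight at the same time during the stress test.
http
  .createServer((req, res) => {
    setTimeout(() => {
      res.writeHead(200, { 'Content-Type': 'text/plain' });
      res.end('slow response\n');
    }, 1000);
  })
  .listen(3001);
```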

Expected behavior (be clear and concise)

The console should not show this warning: MaxListenersExceededWarning: Possible EventEmitter memory leak detected.

How is http-proxy-middleware used in your project?

As shown in example 1:


```typescript
import * as express from 'express';
import { createProxyMiddleware, Filter, Options, RequestHandler } from 'http-proxy-middleware';

const app = express();

app.use('/api', createProxyMiddleware({ target: 'http://www.example.org', changeOrigin: true }));
app.listen(3000);
```


What http-proxy-middleware configuration are you using?

Default

What OS/version and node/version are you seeing the problem on?

Node v18
OS: Node official container.

Additional context (optional)

No response

@khoale-groove

same issue.

@f4z3k4s

f4z3k4s commented Jan 30, 2024

@LRagji thanks, saved me some time! :)

@fetis

fetis commented Apr 15, 2024

Any updates on this?

@Jokero

Jokero commented Apr 30, 2024

I got the same error when using a proxy in Parcel.

@rene-leanix

We're also getting this warning message, @LRagji, but the code you linked should add the on-close event listener only once, since this.serverOnCloseSubscribed will be set:

```typescript
if (server && !this.serverOnCloseSubscribed) {
  server.on('close', () => {
    debug('server close signal received: closing proxy server');
    this.proxy.close();
  });
  this.serverOnCloseSubscribed = true;
}
```

Were you able to debug it?

@LRagji
Author

LRagji commented Jun 10, 2024

@rene-leanix If I remember this correctly, it's about adding the event handler every time a request is made.

@rene-leanix

rene-leanix commented Jun 10, 2024

@LRagji, I just found out that in our code we're calling createProxyMiddleware() not just once but multiple times! That's the reason we ran into the error message. Maybe it's the same for others.

> It's about adding the event handler every time a request is made

But if you look at the code snippet above, I can't see how that could happen if createProxyMiddleware() is only called once: this.serverOnCloseSubscribed is false to start with and is only set in this one place.
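
For illustration, a minimal sketch of the kind of per-request creation that does produce the warning (the inline wrapper here is hypothetical, not taken from our real code): every call to createProxyMiddleware() builds a new proxy instance, and each instance registers its own close listener on the server.

```typescript
import * as express from 'express';
import { createProxyMiddleware } from 'http-proxy-middleware';

const app = express();

// Anti-pattern: a fresh middleware (and proxy instance) is created on every
// request, so each instance ends up attaching its own 'close' listener.
app.use('/api', (req, res, next) => {
  const proxy = createProxyMiddleware({
    target: 'http://www.example.org',
    changeOrigin: true,
  });
  proxy(req, res, next);
});

app.listen(3000);
```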

@LRagji
Author

LRagji commented Jun 10, 2024

Again, this was long back, but I remember the HttpProxyMiddleware class was getting initialised for every request, which was triggering a lot of handlers. I don't remember how, but the default code itself was not working. Is that first example working for you? They also released a new version in April, so I'm not sure if anything in the pipeline changed.

@rene-leanix

I tried to reproduce it as described in your initial post by spinning up that simple app and executing more than 10 curl requests simultaneously against that endpoint, but all requests succeeded and no warning was logged. I also debugged the code you linked to, and both the constructor and the server.on('close') call are executed only once. This was confirmed by adding logs to both places in the http-proxy-middleware dist code.
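
As an aside, one quick way to check whether close listeners are actually accumulating (a debugging sketch of mine, not part of the original report) is to log the server's listener count while the load test runs:

```typescript
import * as express from 'express';
import { createProxyMiddleware } from 'http-proxy-middleware';

const app = express();
app.use('/api', createProxyMiddleware({ target: 'http://www.example.org', changeOrigin: true }));

const server = app.listen(3000);

// Log how many 'close' listeners are attached to the HTTP server.
// A count that keeps growing under load would confirm the leak.
setInterval(() => {
  console.log(`close listeners: ${server.listenerCount('close')}`);
}, 1000);
```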

Could your load-testing setup have triggered a restart of the server between requests?

@Jokero, @fetis, @khoale-groove or @f4z3k4s: can any of you confirm that this issue happens apart from createProxyMiddleware() being called multiple times in your code?

@LRagji
Author

LRagji commented Jun 10, 2024

@rene-leanix What version are you testing? And if you don't mind, can you share the code that was calling the create method over and over again? I can see a lot of changes to that specific createProxyMiddleware method in the library across versions.

@rene-leanix

@LRagji, I am using the latest versions of http-proxy-middleware (3.0.0) and express (4.19.2), so maybe it was fixed recently.

Regarding the code calling the API, I used the quick & dirty

curl http://localhost:3000/api & curl http://localhost:3000/api & ... & curl http://localhost:3000/api

with more than 10 curl statements.
