
I have a Node.js app running on a Digital Ocean MEAN droplet. When more than 300 people are accessing it at the same time, it just hangs. At first I noticed that it was opening over 1,000 files at once. I have moved a bunch of the static images to Amazon S3 to help with that, and raised the open-file limit with ulimit (a temporary fix to stop the server from dying completely). I contacted Digital Ocean and they said the server should be able to handle the traffic.

Just wondering what steps I should be taking to figure out why this is happening.

Some things I am probably doing wrong:

  • I am still serving some static files from the Node server. I plan on reducing this as much as I can.
  • I don't have a reverse proxy set up in front of it (nginx, HAProxy, etc.); see the rough config sketch after this list.
  • The database is running on the same server as the Node.js app.
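
Here is roughly what I imagine the nginx front end would look like if I set one up. The domain, paths, and port 3000 below are just assumptions for my setup, not anything I have configured yet: nginx serves the static files straight from disk and proxies everything else to Node.

```nginx
server {
    listen 80;
    server_name example.com;          # assumption: replace with the real domain

    # Serve static assets directly from disk so Node never sees those requests.
    location /static/ {
        root /var/www/myapp/public;   # assumption: adjust to the real asset path
        access_log off;
        expires 7d;
    }

    # Everything else gets proxied to the Node.js app.
    location / {
        proxy_pass http://127.0.0.1:3000;   # assumption: app listens on port 3000
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```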

If you want me to provide any additional info, please ask. I will gladly do so.

1 Answer
  1. If you are serving static files, hand them off to nginx, as @codephobia mentions.
  2. Look for any functions that end in Sync (fs.readFileSync, fs.statSync, etc.) and see if you can replace them with their asynchronous counterparts, since they block the event loop.
  3. Are you closing your file streams? Off the top of my head, the default open-file limit (ulimit -n) is fairly low, often 256 or 1024, and you could be running out of handles. See the sketch after this list for both points.
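
As a rough illustration of points 2 and 3, here is a small sketch (the file path and port are made up for the example) that contrasts a blocking read with a streamed response and makes sure the stream is destroyed if the client disconnects, so file descriptors don't linger:

```javascript
// Sketch only: the file path and port are assumptions for the example.
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  // Blocking version (avoid): readFileSync stalls the event loop on every
  // request and buffers the whole file in memory.
  // const data = fs.readFileSync('./public/big-image.jpg');
  // res.end(data);

  // Streaming version: the file is piped out in chunks.
  const stream = fs.createReadStream('./public/big-image.jpg'); // assumed path
  stream.pipe(res);

  // If the client disconnects early, destroy the stream so its file
  // descriptor is released instead of lingering.
  res.on('close', () => stream.destroy());

  // On read errors, end the response instead of leaving it hanging.
  stream.on('error', () => {
    if (!res.headersSent) res.statusCode = 500;
    res.end();
  });
}).listen(3000); // assumed port
```

While the app is under load you can also confirm whether leaked handles are the problem with something like lsof -p <pid> | wc -l on the server.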

Without looking at the code I can't say more. The next step is to get a New Relic account and integrate it into your site so you can start logging issues and get to the bottom of it.
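
For reference, wiring up New Relic's Node agent looks roughly like the sketch below. The app name and key are placeholders, and you should check the agent's own docs for the current steps; the important part is that the agent is required before anything else in your main module.

```javascript
// Sketch of New Relic's Node agent setup (names/keys are placeholders).

// newrelic.js — copied from node_modules/newrelic and edited:
//   exports.config = {
//     app_name: ['my-mean-app'],            // illustrative app name
//     license_key: 'YOUR_LICENSE_KEY_HERE'  // placeholder
//   };

// server.js — require the agent before any other module so it can
// instrument http, Express, and the MongoDB driver:
require('newrelic');
const express = require('express');
const app = express();
// ... rest of the app as before
app.listen(3000); // assumed port
```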

Mark D
    I actually signed up for New Relic last night, before I read this, and it was plainly obvious that the server just couldn't handle serving those static files. I moved them all to Amazon S3 and that dropped all the numbers into a great range, with the Apdex score sitting at 0.98 - 0.99. I have yet to test it with a lot of people (I haven't had more than 100 on at once since the update), but it looks promising. – codephobia Dec 20 '14 at 11:06