A good technique is to run a simple test many times. Occasionally this turns up things like memory leaks, file handle leaks, temporary file leaks, and so on.

If you have a simple web server for static content, and it works fine for a few requests, and seems fast when serving a thousand requests, you could try running a million requests. Or a billion. Not necessarily with many concurrent clients, although that would be a good test too; just sequential requests.
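A minimal sketch of that kind of soak test, assuming Linux (it counts open file descriptors via /proc/self/fd) and using Python's stdlib server as a stand-in for the server under test; the request count is an arbitrary illustration and would be cranked up much higher in practice:

```python
import http.server
import os
import threading
import urllib.request

def open_fd_count():
    # Linux-specific: /proc/self/fd lists this process's open descriptors.
    return len(os.listdir("/proc/self/fd"))

# Spin up a throwaway static server on an ephemeral port.
server = http.server.HTTPServer(("127.0.0.1", 0),
                                http.server.SimpleHTTPRequestHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

baseline = open_fd_count()
for _ in range(1000):  # scale this up to a million for a real soak test
    with urllib.request.urlopen(f"http://127.0.0.1:{port}/") as resp:
        resp.read()
server.shutdown()

# If descriptors leak per request, this delta grows with the loop count.
leaked = open_fd_count() - baseline
print(f"fd delta after run: {leaked}")
```

Here the client and server share a process only for convenience; the point is that a leak of even one descriptor per request is invisible at ten requests and fatal at a million.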

I found a limitation in ab that way.


Extremely true for storage systems. Writing a file works fine. Writing a hundred files works fine. Fill the storage up with files, and suddenly something goes wrong at 80% or 98%...
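One cheap way to exercise the full-disk path without actually filling a disk is Linux's /dev/full device, which fails every write with ENOSPC. A sketch, assuming Linux and a hypothetical safe_write helper:

```python
import errno

def safe_write(path, data):
    """Write data to path; return None on success, errno on failure."""
    try:
        # buffering=0 so the write hits the OS immediately rather than
        # failing later at close time.
        with open(path, "wb", buffering=0) as f:
            f.write(data)
        return None
    except OSError as e:
        return e.errno

# /dev/full behaves like a disk that is always 100% full.
err = safe_write("/dev/full", b"x" * 4096)
print(err == errno.ENOSPC)
```

Code that silently ignores the return value of write, or never fsyncs, tends to pass every test until the volume is nearly full; pointing it at /dev/full surfaces that class of bug immediately.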
