While wandering around the world and the internet, I settled on a simple rule of thumb: 2 significant digits are enough to describe the number of users of an information system accurately.
To see why this is sufficient, let's look at 3 scales: systems with tens, thousands, and millions of simultaneous connections/active users.
For a system with tens of users, a single user can significantly change the system's behavior; gaming VoIP services like Mumble are a good example.
For a system with thousands of users, one or even tens of extra users may not change behavior noticeably, but a hundred can be enough to alter the average latency. As an example, sales_backend can handle 6000 users with a latency under 90 ms, but at 6100 users the latency climbs to 110 ms, and it grows exponentially from there.
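Notice that two significant digits are still enough to tell these two loads apart: 6000 and 6100 users render as distinct values. A minimal sketch in Python, using the (hypothetical) sales_backend figures above:

```python
# Hypothetical figures from the sales_backend example above:
# user count -> average latency in ms.
observations = {6000: 90, 6100: 110}

# Render each user count with two significant digits ("6.0k", "6.1k").
labels = {users: f"{users / 1000:.1f}k" for users in observations}

for users, latency_ms in observations.items():
    print(f"{labels[users]} users -> ~{latency_ms} ms")
```

The two-digit labels preserve the difference that matters here: the load level at which latency starts to degrade.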
For a system with millions of simultaneous users, likely spread across multiple servers to handle the load, what makes the difference is how an extra few hundred thousand users creates the need for more servers and puts contention on cache and database systems. This in turn pushes toward widely scalable databases, or microservices with separate databases.
In every case, 2 significant digits turn out to be a good precision for counting users.
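The rule of thumb can be sketched as a small helper that keeps only the two leading digits of a count. This is my own illustration (the function name and interface are not from any library), assuming counts are non-negative integers:

```python
import math

def round_sig(count: int, digits: int = 2) -> int:
    """Round a user count to the given number of significant digits."""
    if count == 0:
        return 0
    # Position of the leading digit, e.g. 3 for 6143 (10^3 <= 6143 < 10^4).
    magnitude = math.floor(math.log10(abs(count)))
    # Never scale below 1: counts with <= `digits` digits stay unchanged.
    factor = 10 ** max(magnitude - digits + 1, 0)
    return round(count / factor) * factor

# round_sig(6143)      -> 6100
# round_sig(1_234_567) -> 1_200_000
# round_sig(42)        -> 42
```

Whether the system serves tens, thousands, or millions of users, the result keeps exactly the level of detail the examples above rely on.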