Re: Perl tk question
by arkturuz (Curate) on Jul 29, 2013 at 18:05 UTC
the gui serves as a server.
This is good. There shouldn't be problems as the number of clients increases, as long as no single one of them can lock up the whole application.
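As a sketch of what "not locking the whole application" means in a Tk GUI that also serves: register the listening socket with fileevent so connections are serviced from the Tk event loop rather than a blocking accept/read. The port number and the request handling are assumptions, not taken from your code.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Tk;
use IO::Socket::INET;

my $mw = MainWindow->new;

# Listening socket; the port is an assumption for illustration.
my $listen = IO::Socket::INET->new(
    LocalPort => 7000,
    Listen    => 5,
    Reuse     => 1,
) or die "listen: $!";

# Accept connections from the Tk event loop, never blocking the GUI.
$mw->fileevent($listen, 'readable', sub {
    my $client = $listen->accept or return;
    $mw->fileevent($client, 'readable', sub {
        my $line = <$client>;
        if (!defined $line) {                 # client disconnected
            $mw->fileevent($client, 'readable', '');  # unregister
            close $client;
            return;
        }
        # handle_request($line);  # placeholder for your own logic
    });
});

MainLoop;
```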
It stores the results in MySQL database.
The database will be a bottleneck. 100 clients and 20,000 clients are not remotely the same load if each of them wants to create, read, update, or delete something every other second or so. The database will have to be tuned for heavy traffic, and you will probably have to funnel all queries through a small number of server processes dedicated to that. Alternatively you could use a messaging system like RabbitMQ or similar, although that would probably mean rewriting a lot of code.
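One minimal way to funnel writes through a dedicated process is a forked writer that owns the single MySQL connection and reads jobs from a pipe, so clients never touch the database directly. The DSN, credentials, and table name below are placeholders, not taken from your setup.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

pipe(my $reader, my $writer) or die "pipe: $!";

my $pid = fork // die "fork: $!";
if ($pid == 0) {
    # Child: the one dedicated DB writer.
    close $writer;
    my $dbh = DBI->connect(
        'dbi:mysql:database=jobs',   # DSN/table are assumptions
        'user', 'password',
        { RaiseError => 1, AutoCommit => 1 },
    );
    my $sth = $dbh->prepare('INSERT INTO results (payload) VALUES (?)');
    while (my $line = <$reader>) {
        chomp $line;
        $sth->execute($line);        # one connection, serialized writes
    }
    $dbh->disconnect;
    exit 0;
}

# Parent (or any client): just print a job down the pipe.
close $reader;
print {$writer} "some result\n";
close $writer;
waitpid $pid, 0;
```

In a real deployment the pipe would more likely be a Unix or TCP socket that all 100 clients connect to, but the principle is the same: the database sees a handful of long-lived connections instead of thousands of short-lived ones.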
Other than that, it depends on what kind of jobs these programs are doing. Is it one process forked 100 times, a handful of different processes each forked many times, or really 100 distinct programs? Do they respawn frequently on errors (e.g. calling die() on every warn())? Is there a lot of network activity?
As the number of clients grows, I would concentrate on three things: making sure database access keeps working properly, handling network issues with proper timeouts, and detecting very frequent respawns after errors as early as possible.
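The last two points can be sketched in a few lines: an alarm()-based timeout wrapper for network calls, and a respawn throttle that backs off when a worker dies too often per minute. Both helpers (with_timeout, note_death_and_maybe_backoff) and the 5-deaths-per-minute threshold are my own illustrative assumptions.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Run $code with a timeout; returns (1, result) on success,
# ('', undef) on timeout. (A $code that returns undef also
# reports false here; fine for a sketch.)
sub with_timeout {
    my ($seconds, $code) = @_;
    my $result = eval {
        local $SIG{ALRM} = sub { die "timeout\n" };
        alarm $seconds;
        my $r = $code->();
        alarm 0;
        $r;
    };
    alarm 0;                         # always cancel the pending alarm
    die $@ if $@ && $@ ne "timeout\n";  # re-raise real errors
    return (defined $result, $result);
}

# Respawn throttle: if a worker dies more than $max_per_min times
# in a minute, log it and back off instead of restarting instantly.
my @deaths;
my $max_per_min = 5;                 # threshold is an assumption
sub note_death_and_maybe_backoff {
    my $now = time;
    push @deaths, $now;
    @deaths = grep { $now - $_ < 60 } @deaths;
    if (@deaths > $max_per_min) {
        warn "worker respawning too fast; backing off\n";
        sleep 30;
    }
}
```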
This is pretty general, but it also depends on what kind of work your software is doing.