Re(8): "The First Rule of Distributed Objects is..."
by perrin (Chancellor) on Oct 23, 2003 at 20:18 UTC
> If you can optimize the heavy parts, everything gets quicker. Same reason you use profilers.
I think that's actually a different deal. Optimizing and profiling are about improving the efficiency of the most important pieces of code, reducing the amount of resources needed. Here we're just talking about allocating resources.
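To make the profiling side of that distinction concrete, here is a minimal sketch of finding the "heavy parts" with a profiler. The thread's context is Perl/mod_perl, but this uses Python's standard-library cProfile purely for brevity; the function names are made up for illustration.

```python
import cProfile
import io
import pstats

def heavy(n):
    # A deliberately expensive piece of work: this is what a
    # profiler should surface as the dominant cost.
    return sum(i * i for i in range(n))

def light():
    # A cheap call that should barely register.
    return "ok"

pr = cProfile.Profile()
pr.enable()
heavy(100_000)
light()
pr.disable()

# Print the top entries sorted by cumulative time; the heavy
# function shows up at the top, telling you where to optimize.
s = io.StringIO()
pstats.Stats(pr, stream=s).sort_stats("cumulative").print_stats(5)
print(s.getvalue())
```

The point of the distinction above: a profiler tells you *which* code to make cheaper, which is a different activity from deciding how many machines or CPUs each part of the system gets.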
> The point is, by keeping the heavy parts completely isolated from the quicker parts and paying attention to those heavy parts, things will always run fast. If those heavy parts get bogged down again, the quick parts stay quick.
I agree: if you can't afford enough resources to make your whole application run well, you can sacrifice part of its performance by under-allocating resources to one part in order to keep another part that you consider more important running quickly. This is a popular idea on big iron systems where you can do things like pin a certain number of CPUs to one job. It doesn't require distributed objects though.
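As a small illustration of the CPU-pinning idea, here is a Linux-only Python sketch using `os.sched_setaffinity` to restrict a process to a subset of CPUs, leaving the rest free for a more important job. The CPU ids chosen are assumptions; adjust them to the machine's topology.

```python
import os

# Linux-only sketch: confine the current process to (at most) CPUs 0
# and 1, so the CPUs outside that set stay reserved for other work.
if hasattr(os, "sched_setaffinity"):
    available = os.sched_getaffinity(0)   # CPUs we may run on now
    # Pick a subset to pin to (fall back to the full set if the
    # arbitrary choice of "cpu < 2" matches nothing).
    reserved = {cpu for cpu in available if cpu < 2} or available
    os.sched_setaffinity(0, reserved)     # restrict to that subset
    print(sorted(os.sched_getaffinity(0)))
```

This is the same resource-allocation move as pinning CPUs on big iron, done at the OS-scheduler level on a single box, and it involves no distributed objects at all.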
> Totally agree with you, but putting some stuff on static pages isn't always an option. And load balancers do solve part of the problem, but not the total problem.
I was actually talking about dynamic pages there, not static ones, like the ones in your login page and preferences page example. Good load balancers do solve the problem of isolating specific URLs to run on specific groups of machines, and I wouldn't consider load balancers or mod_proxy any harder to deal with than J2EE deployment and configuration stuff.
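The mod_proxy approach mentioned above can be sketched as a few lines of Apache configuration routing specific URLs to separate backend pools. The hostnames and paths here are assumptions for illustration, not from the thread; `ProxyPass` matches the most specific prefix first, so the expensive URLs must be listed before the catch-all.

```apache
# httpd.conf sketch: send expensive URLs to their own backend pool
# (hostnames and paths are hypothetical)
ProxyPass        /report/  http://heavy-cluster.internal/report/
ProxyPassReverse /report/  http://heavy-cluster.internal/report/

# Everything else goes to the general-purpose pool
ProxyPass        /         http://fast-cluster.internal/
ProxyPassReverse /         http://fast-cluster.internal/
```

This gives the same per-URL isolation a hardware load balancer provides, with nothing more exotic than a front-end Apache.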
> Probably something you wouldn't do in mod_perl, but in more business-directed languages, like Java or even COBOL :)
Hmmm... Java was created for programming toasters and refrigerators. It's a good general-purpose language, but the whole business slant is just marketing.