I guess I could requalify the question with something like "Has anyone ever written a useful, practical program that used more than 4000 concurrent classes, or 1 million concurrent instances of any given class", but then you could say that your snippets were useful because they proved it could be done :)
An idea I am playing with uses a 32-bit value as a handle to object instances. That dword will indicate both the class and the instance of that class. There are various ways that I could subdivide the bits: 64K instances x 64K classes, 16M x 256, etc. Looking at the possible splits, I've pretty much concluded that 12 bits for classes and 20 bits for instances is a reasonable split for most purposes, but was trying to elicit a response of "Yeah! My Finite Element Analysis (or Ray Tracing) program routinely has over a million concurrent instances of xxx class" or similar.
In theory, and probably in practice, it would be possible to define the split at compile time. Given that most current processors are 32-bit with 4GB max memory, and each instance of a class is always going to occupy more than 1 byte, it is unlikely that any single program will create more than 2**32 concurrent instances across all classes. If, as, and when the program was ever moved to a 64-bit platform, 4G instances of 4G classes (a 32/32 split of the 64 bits) would suffice for most purposes.
So it is just a case of deciding on the most suitable default split for the 32 bits.
Examine what is said, not who speaks.
"Efficiency is intelligent laziness." -David Dunham
"Think for yourself!" - Abigail
"Memory, processor, disk in that order on the hardware side. Algorithm, algorithm, algorithm on the code side." - tachyon