How about double-logging:
Log to the local SQLite database, AND create an incrementing log file containing SQL statements.
A separate process (either local, or triggered on request by the server) can periodically ship the incrementing file to the server and delete it.
The server can apply the received logs and maintain the equivalent of a replicated database, to generate charts.
This has the advantage of scaling to multiple logging sources feeding centralized data processing (I'm an old-time mainframe guy).
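A minimal sketch of the double-logging idea (in Python's stdlib sqlite3 for brevity, though the thread is Perl-centric; the table name `readings`, the file names, and the function names are all made up for illustration):

```python
import os
import sqlite3

LOCAL_DB = "sensor.db"   # local SQLite database (hypothetical name)
SHIP_LOG = "ship.sql"    # incrementing file of SQL statements to ship

def log_reading(ts, value):
    """Write the reading locally AND append the same SQL to the ship file."""
    with sqlite3.connect(LOCAL_DB) as db:
        db.execute("CREATE TABLE IF NOT EXISTS readings (ts TEXT, value REAL)")
        db.execute("INSERT INTO readings (ts, value) VALUES (?, ?)", (ts, value))
    # Append a literal statement the server can replay verbatim.
    # (String interpolation is fine for a sketch, but real code should
    # escape or parameterize the values.)
    with open(SHIP_LOG, "a") as f:
        f.write(f"INSERT INTO readings (ts, value) VALUES ('{ts}', {value});\n")

def apply_shipped_log(server_db_path, log_path):
    """Server side: replay the shipped SQL into the replica, then delete it."""
    with sqlite3.connect(server_db_path) as db:
        db.execute("CREATE TABLE IF NOT EXISTS readings (ts TEXT, value REAL)")
        with open(log_path) as f:
            db.executescript(f.read())
    os.remove(log_path)
```

The replica database on the server then holds the same rows as the loggers and can be queried for chart generation.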
I hope life isn't a big joke, because I don't get it.
-SNL
Either/or -- though it's probably simpler to install all the prerequisites on your dev machine and just upload images; web hosting can be a PITA.
It depends. How much control do you have over the web server? Can it run Perl and SQLite, and does it have all the necessary modules and GNUPlot installed? Do you need the charts generated dynamically, at arbitrary times and for arbitrary periods, or will you only offer standard charts for fixed periods? Is there enough disk space to keep a large number of pre-rendered charts online?
CountZero
"A program should be light and agile, its subroutines connected like a string of pearls. The spirit and intent of the program should be retained throughout. There should be neither too little nor too much, neither needless loops nor useless variables, neither lack of structure nor overwhelming rigidity." - The Tao of Programming, 4.1 - Geoffrey James
My blog: Imperial Deltronics