PerlMonks  

Perl/mySQL question

by $CBAS (Scribe)
on Feb 20, 2001 at 02:10 UTC ( #59497=perlquestion )

$CBAS has asked for the wisdom of the Perl Monks concerning the following question:

Hi,

I'm very new to database "stuff" so this is the best place I know to ask this (boy does that sound negative! ;-)).
Right now I'm figuring out the basics with mySQL, but I will probably have to use a non-open-source DB when this project takes off ... I just hope that doesn't mean I'll have to start learning all over again.

Here's the deal:
I'm working with 2 databases. They are separated for security reasons.
One is on a private intranet. It's constantly changing.
The other is online and part of a website where people can see what's happening.

The problem is that I need to synchronize the online database every x minutes and I have no idea how to do this :-(

Are there any tricks (Perl or SQL) to get the changed parts of a database so I don't need to copy the whole thing (which would be impossible)?
Should I just check EVERY item in it with a hash of the data that was in it last time?

Any links to good database information sites are appreciated because I'm just getting into it and I like to learn as much as possible, as fast as possible ;-)

thanks!
CBAS

Re: Perl/mySQL question
by marius (Hermit) on Feb 20, 2001 at 02:14 UTC
    I don't know that Perl is the (best) way to do this, but it can be done in Perl I'm sure... However:
    If you are running MySQL 3.23.33 or higher, you may want to look at this site. This is arguably the better way to do database mirroring, and similar replication will likely be supported in the larger closed-source database implementations. (You'll probably have to re-learn the replication aspect of Oracle/Informix/Sybase/etc.)
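    [Editor's note: the link above has not survived. For flavor, one-way replication in MySQL 3.23 came down to a pair of my.cnf entries; the option names below follow the 3.23 manual, while the hostname and credentials are placeholders.]

    ```ini
    # master (intranet host) my.cnf -- write a binary log for the slave to read
    [mysqld]
    log-bin
    server-id = 1

    # slave (web host) my.cnf -- where to pull updates from
    [mysqld]
    server-id       = 2
    master-host     = intranet-db
    master-user     = repl
    master-password = secret
    ```

    With this in place the slave applies every change the master logs, so no per-interval sync script is needed at all.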

    -marius

      Aah, that's the type of info I'm looking for, a big thanks to you!

      CBAS

      Since the above-mentioned link is the way that mysql provides for such a thing, I would recommend setting up one-way replication like this. However, if there is something magical about the database needing to be updated at a set interval, you could always use mysqldump on the secure host, then rebuild the web host from the dump without needing to close either host down.
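      [Editor's note: the dump-and-rebuild fallback can be sketched in a few lines of Perl. Hostnames and the database name below are placeholders, and the actual system() call is left commented since no live servers are assumed.]

      ```perl
      #!/usr/bin/perl
      use strict;

      # Build the dump-and-rebuild pipeline: mysqldump writes CREATE TABLE
      # and INSERT statements to stdout; piping them into a mysql client
      # pointed at the web host rebuilds that copy in one step.
      sub rebuild_command {
          my ($db, $src_host, $dst_host) = @_;
          return "mysqldump -h $src_host --opt $db | mysql -h $dst_host $db";
      }

      my $cmd = rebuild_command('inventory', 'intranet-db', 'web-db');
      print "$cmd\n";
      # system($cmd) == 0 or die "sync failed: $?";  # run this from cron every x minutes
      ```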
Re: Perl/mySQL question
by arturo (Vicar) on Feb 20, 2001 at 02:35 UTC

    What you might try is keeping an incremental log of each completed transaction that's been issued since the last time your script ran. (Yes, I believe MySQL only allows "transactions", properly so-called, via a somewhat hackish maneuver at the moment -- so read this as "UPDATE, INSERT or DELETE".) Presumably, this means you'll have to have access to the apps that are used for updating the internal database so they can write the logs. And be prepared to learn all about file locking, too, because you'll want the updaters to be unable to add to the file when you're reading from it to do the updates (any updates recorded while the script is running will have to go to some sort of temp file ... and thinking about the concurrency issues is making my head spin). Or maybe MySQL can do this on its own (I'm pretty sure Oracle etc. has such a facility).

    Then you simply issue those SQL commands to the publicly accessible DB and ba da bing!
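    [Editor's note: the drain-the-log-under-a-lock step above can be sketched as below. The log path is a placeholder, and replaying the statements via DBI is left as a comment since no live database is assumed here.]

    ```perl
    #!/usr/bin/perl
    use strict;
    use Fcntl qw(:flock);

    # Updating apps append one SQL statement per line to this file;
    # every x minutes we drain it under an exclusive lock and replay
    # the statements against the public database.
    my $log = '/var/spool/db-sync/changes.sql';

    sub drain_log {
        my ($path) = @_;
        open my $fh, '+<', $path or die "can't open $path: $!";
        flock $fh, LOCK_EX or die "can't lock $path: $!";  # writers must block on this lock too
        my @statements = grep { /\S/ } map { chomp; $_ } <$fh>;
        truncate $fh, 0 or die "can't truncate $path: $!"; # consumed -- start fresh
        close $fh;                                         # releases the lock
        return @statements;
    }

    # for my $sql (drain_log($log)) {
    #     $dbh->do($sql);   # $dbh = DBI->connect(...) against the public DB
    # }
    ```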

    Hope that's workable ..

    Philosophy can be made out of anything. Or less -- Jerry A. Fodor

      Thanks for the help, but the software used to update the local DB is way too closed-source for this to work; it's impossible for me to add a track-keeping mechanism.

      Also, I was specifically told NOT to use file locking on the whole table, another problem.

      Thanks for the idea anyway,
      CBAS

Re: Perl/mySQL question
by CiceroLove (Monk) on Feb 20, 2001 at 22:18 UTC
    If you are using MySQL now but something else later, this may not be such a good suggestion, but I am sure there will be an equivalent command. If the internal database is the one that will be the most up-to-date, and the one on the Web is the one that you need to keep updated, then why don't you just do a mysqldump and then a mysqlimport? We use it a lot here and it is a simple one step in Perl. In other words, let mySQL do its job and just have Perl remind it to do so.
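    [Editor's note: the mysqldump/mysqlimport pairing described here differs from piping a dump straight into mysql: `mysqldump --tab` writes one tab-delimited .txt file per table into a directory, and mysqlimport loads those files on the other side. The sketch below only builds the two command strings; the database name, spool directory, and hostname are placeholders. Note that --tab must run on the dumping server itself.]

    ```perl
    #!/usr/bin/perl
    use strict;

    # Build the two halves of the dump/import cycle: the first command is
    # run on the intranet host, the second loads the resulting .txt files
    # into the web host's copy (--replace overwrites rows already there).
    sub dump_and_import_commands {
        my ($db, $dir, $web_host) = @_;
        return (
            "mysqldump --tab=$dir $db",
            "mysqlimport --replace -h $web_host $db $dir/*.txt",
        );
    }

    my @cmds = dump_and_import_commands('inventory', '/tmp/dbsync', 'web-db');
    print "$_\n" for @cmds;
    # Wrap the above in system() calls and a cron entry, and that is all
    # the "Perl reminder" needed.
    ```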

    CiceroLove
