Hi, yes, the 100,000 reads were only for the test, so I could measure the time difference between one method and the other.
The real job doesn't read that many records, but the design is so crazy that we have to do a lot of work.
First, there is a keyid that is kept in a table.
There is also a table for the header and another table for the details.
To add a header, I have to read the keyid table, get the keyid, then update the table with keyid + 1.
Then I can add the header with the new keyid.
To add a detail, I have to do the same:
read the keyid table, get the keyid, then update the table with keyid + 1,
then add the detail with the new keyid.
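The per-record allocation above can be sketched like this. It's a minimal sketch using sqlite3 as a stand-in for whatever database the package uses; the table and column names (keyid/last_id, header, detail) are illustrative, and I'm assuming the keyid table holds the last id used:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE keyid  (last_id INTEGER);  -- one row holding the last id used (assumption)
CREATE TABLE header (id INTEGER PRIMARY KEY, descr TEXT);
CREATE TABLE detail (id INTEGER PRIMARY KEY, header_id INTEGER, qty INTEGER);
INSERT INTO keyid VALUES (0);
""")

def next_keyid(cur):
    # Read the keyid table, then update it with keyid + 1.
    cur.execute("SELECT last_id FROM keyid")
    last = cur.fetchone()[0]
    cur.execute("UPDATE keyid SET last_id = last_id + 1")
    return last + 1

hdr_id = next_keyid(cur)                 # one read + one update for the header
cur.execute("INSERT INTO header VALUES (?, ?)", (hdr_id, "order 1"))
for qty in (5, 7):                       # one read + one update per detail line
    cur.execute("INSERT INTO detail VALUES (?, ?, ?)",
                (next_keyid(cur), hdr_id, qty))
conn.commit()
```

So every header and every detail line costs one read and one update of the keyid table, which is where the time goes.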
I know this probably looks weird, but it's a package we cannot modify, so we have to work with the database as they defined it.
So now, to gain performance, since I know how many details I have to add, instead of getting the keyid and incrementing it by 1 each time, I update it once by the number of detail lines + 1 (for the header).
E.g., if I have 10 detail lines + 1 header:
I read the keyid, add 11 to it, then update the table.
I add the header with keyid - 10.
I add detail line 1 with keyid - 9.
...
I add detail line 10 with keyid.
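The batched version can be sketched the same way: one read and one update reserves the whole block of ids up front. Again a minimal sketch with sqlite3 and illustrative table/column names, assuming the keyid table holds the last id used:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE keyid  (last_id INTEGER);  -- one row holding the last id used (assumption)
CREATE TABLE header (id INTEGER PRIMARY KEY, descr TEXT);
CREATE TABLE detail (id INTEGER PRIMARY KEY, header_id INTEGER, qty INTEGER);
INSERT INTO keyid VALUES (0);
""")

def reserve_keyids(cur, count):
    # One read + one update reserves `count` consecutive ids at once.
    cur.execute("SELECT last_id FROM keyid")
    last = cur.fetchone()[0]
    cur.execute("UPDATE keyid SET last_id = last_id + ?", (count,))
    return last + 1                      # first id in the reserved block

details = [5, 7, 3]                      # e.g. 3 detail lines
first = reserve_keyids(cur, len(details) + 1)   # + 1 for the header
cur.execute("INSERT INTO header VALUES (?, ?)", (first, "order 1"))
for offset, qty in enumerate(details, start=1):
    cur.execute("INSERT INTO detail VALUES (?, ?, ?)",
                (first + offset, first, qty))
conn.commit()
```

The keyid table is now touched twice per document instead of twice per record, at the cost that the ids must be handed out from the reserved block in order.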